# 6.4 Semidefinite Optimization

Semidefinite optimization is a generalization of conic quadratic optimization, allowing the use of matrix variables belonging to the convex cone of positive semidefinite matrices

$\PSD^r = \left\lbrace X \in \Symm^r: z^T X z \geq 0, \quad \forall z \in \real^r \right\rbrace,$

where $$\Symm^r$$ is the set of $$r \times r$$ real-valued symmetric matrices.
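Membership in $$\PSD^r$$ can be checked numerically via the eigenvalues: a symmetric matrix satisfies $$z^T X z \geq 0$$ for all $$z$$ exactly when all its eigenvalues are nonnegative. A minimal NumPy sketch (for illustration only; NumPy is not required by MOSEK):

```python
import numpy as np

def is_psd(X, tol=1e-10):
    """Return True if the symmetric matrix X lies in the PSD cone,
    i.e. z^T X z >= 0 for all z, equivalently all eigenvalues >= 0."""
    assert np.allclose(X, X.T), "X must be symmetric"
    return bool(np.all(np.linalg.eigvalsh(X) >= -tol))

# The objective coefficient matrix from the example below is in PSD^3
C = np.array([[2., 1., 0.],
              [1., 2., 1.],
              [0., 1., 2.]])
print(is_psd(C))   # True (eigenvalues 2 - sqrt(2), 2, 2 + sqrt(2))
```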

MOSEK can solve semidefinite optimization problems of the form

$\begin{split}\begin{array}{lccccll} \mbox{minimize} & & & \sum_{j=0}^{n-1} c_j x_j + \sum_{j=0}^{p-1} \left\langle \barC_j, \barX_j \right\rangle + c^f & & &\\ \mbox{subject to} & l_i^c & \leq & \sum_{j=0}^{n-1} a_{ij} x_j + \sum_{j=0}^{p-1} \left\langle \barA_{ij}, \barX_j \right\rangle & \leq & u_i^c, & i = 0, \ldots, m-1,\\ & l_j^x & \leq & x_j & \leq & u_j^x, & j = 0, \ldots, n-1,\\ & & & x \in \K, \barX_j \in \PSD^{r_j}, & & & j = 0, \ldots, p-1 \end{array}\end{split}$

where the problem has $$p$$ symmetric positive semidefinite variables $$\barX_j\in \PSD^{r_j}$$ of dimension $$r_j$$ with symmetric coefficient matrices $$\barC_j\in \Symm^{r_j}$$ and $$\barA_{ij}\in \Symm^{r_j}$$. We use standard notation for the matrix inner product, i.e., for $$A,B\in \real^{m\times n}$$ we have

$\left\langle A,B \right\rangle := \sum_{i=0}^{m-1} \sum_{j=0}^{n-1} A_{ij} B_{ij}.$
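For a concrete instance, the inner product is just the sum of elementwise products. A NumPy sketch, using the objective matrix from the example in the next section paired with the all-ones matrix (illustration only; not part of the MOSEK API):

```python
import numpy as np

A = np.array([[2., 1., 0.],
              [1., 2., 1.],
              [0., 1., 2.]])
B = np.ones((3, 3))

# <A, B> is the sum of the elementwise products A_ij * B_ij
inner = float(np.sum(A * B))
print(inner)   # 10.0
```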

## 6.4.1 Example SDO1

We consider the simple optimization problem with semidefinite and conic quadratic constraints:

$\begin{split}\begin{array} {llcc} \mbox{minimize} & \left\langle \left[ \begin{array} {ccc} 2 & 1 & 0 \\ 1 & 2 & 1 \\ 0 & 1 & 2 \end{array} \right], \barX \right\rangle + x_0 & & \\ \mbox{subject to} & \left\langle \left[ \begin{array} {ccc} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{array} \right], \barX \right\rangle + x_0 & = & 1, \\ & \left\langle \left[ \begin{array}{ccc} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{array} \right], \barX \right\rangle + x_1 + x_2 & = & 1/2, \\ & x_0 \geq \sqrt{{x_1}^2 + {x_2}^2}, & \barX \succeq 0. & \end{array}\end{split}\tag{1}$

The problem description contains a 3-dimensional symmetric semidefinite variable which can be written explicitly as:

$\begin{split}\barX = \left[ \begin{array} {ccc} \barX_{00} & \barX_{10} & \barX_{20} \\ \barX_{10} & \barX_{11} & \barX_{21} \\ \barX_{20} & \barX_{21} & \barX_{22} \end{array} \right] \in \PSD^3,\end{split}$

and a conic quadratic variable $$(x_0, x_1, x_2) \in \Q^3$$. The objective is to minimize

$2(\barX_{00} + \barX_{10} + \barX_{11} + \barX_{21} + \barX_{22}) + x_0,$

subject to the two linear constraints

$\begin{split}\begin{array}{ccc} \barX_{00} + \barX_{11} + \barX_{22} + x_0 & = & 1, \\ \barX_{00} + \barX_{11} + \barX_{22} + 2(\barX_{10} + \barX_{20} + \barX_{21}) + x_1 + x_2 & = & 1/2. \end{array}\end{split}$
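Both expansions are easy to verify numerically: the inner product with the identity is the trace of $$\barX$$, while the inner product with the all-ones matrix sums every entry, i.e. the diagonal plus twice the strictly lower triangle. A NumPy check (illustration only, not part of the MOSEK code):

```python
import numpy as np

# A symmetric example matrix (only symmetry matters for this check)
X = np.array([[1., 2., 3.],
              [2., 4., 5.],
              [3., 5., 6.]])

I = np.eye(3)
E = np.ones((3, 3))

# <I, X> is the trace: X00 + X11 + X22
assert np.isclose(np.sum(I * X), np.trace(X))

# <E, X> is the sum of all entries: the diagonal plus twice the
# strictly lower triangle, matching the expanded constraint above
lower = X[np.tril_indices(3, -1)]
assert np.isclose(np.sum(E * X), np.trace(X) + 2 * lower.sum())
```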

**Setting up the linear and quadratic part**

The linear and quadratic parts (constraints, variables, objective, cones) are set up using the methods described in the relevant tutorials: Sec. 6.1 (Linear Optimization) and Sec. 6.3 (Conic Quadratic Optimization). Here we only discuss the aspects directly involving semidefinite variables.

**Appending semidefinite variables**

First, we need to declare the number of semidefinite variables in the problem, similarly to the number of linear variables and constraints. This is done with the function Task.appendbarvars.

            task.appendbarvars(BARVARDIM)


**Appending coefficient matrices**

Coefficient matrices $$\barC_j$$ and $$\barA_{ij}$$ are constructed as weighted combinations of sparse symmetric matrices previously appended with the function Task.appendsparsesymmat.

            symc = task.appendsparsesymmat(BARVARDIM[0],
                                           barci,
                                           barcj,
                                           barcval)

            syma0 = task.appendsparsesymmat(BARVARDIM[0],
                                            barai[0],
                                            baraj[0],
                                            baraval[0])

            syma1 = task.appendsparsesymmat(BARVARDIM[0],
                                            barai[1],
                                            baraj[1],
                                            baraval[1])


The arguments specify the dimension of the symmetric matrix, followed by its description in the sparse triplet format. Only lower-triangular entries should be included. The function returns a unique index identifying the matrix just entered in the collection of all coefficient matrices defined by the user.
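For instance, the lower-triangular triplet description of the objective matrix $$\barC$$ from (1) can be derived mechanically; the resulting arrays match the `barci`, `barcj` and `barcval` data used in the listing below. A NumPy sketch (illustration only):

```python
import numpy as np

C = np.array([[2., 1., 0.],
              [1., 2., 1.],
              [0., 1., 2.]])

# Keep only the lower triangle and list its nonzeros as triplets
rows, cols = np.nonzero(np.tril(C))
vals = C[rows, cols]

print(rows.tolist())   # [0, 1, 1, 2, 2]
print(cols.tolist())   # [0, 0, 1, 1, 2]
print(vals.tolist())   # [2.0, 1.0, 2.0, 1.0, 2.0]
```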

After one or more symmetric matrices have been created using Task.appendsparsesymmat, we can combine them to set up the objective matrix coefficient $$\barC_j$$ using Task.putbarcj, which forms a linear combination of one or more symmetric matrices. In this example we form the objective matrix directly, i.e. as a weighted combination of a single symmetric matrix.

            task.putbarcj(0, [symc], [1.0])


Similarly, a constraint matrix coefficient $$\barA_{ij}$$ is set up by the function Task.putbaraij.

            task.putbaraij(0, 0, [syma0], [1.0])
            task.putbaraij(1, 0, [syma1], [1.0])


**Retrieving the solution**

After the problem is solved, we read the solution using Task.getbarxj:

                task.getbarxj(mosek.soltype.itr, 0, barx)


The function returns the half-vectorization of $$\barX_j$$ (the lower-triangular part stacked column by column into a vector of length $$r_j(r_j+1)/2$$), where the semidefinite variable index $$j$$ is passed as an argument.
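A small helper (hypothetical; not part of the MOSEK API) illustrates how to expand such a half-vectorization back into a full symmetric matrix, assuming the lower triangle is stacked column by column:

```python
import numpy as np

def unvec_lower(barx, r):
    """Expand a lower-triangular, column-stacked half-vectorization
    (length r*(r+1)/2) back into a full symmetric r x r matrix."""
    X = np.zeros((r, r))
    k = 0
    for j in range(r):           # column by column
        for i in range(j, r):    # rows j..r-1 of column j
            X[i, j] = barx[k]
            X[j, i] = barx[k]    # mirror to the upper triangle
            k += 1
    return X

# Example: r = 3 gives a half-vectorization of 3*(3+1)/2 = 6 entries
print(unvec_lower([1, 2, 3, 4, 5, 6], 3))
```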

**Source code**

Listing 8: Source code solving problem (1).
    import sys
    import mosek

    # Since the value of infinity is ignored, we define it solely
    # for symbolic purposes
    inf = 0.0

    # Define a stream printer to grab output from MOSEK
    def streamprinter(text):
        sys.stdout.write(text)
        sys.stdout.flush()

    def main():
        # Make mosek environment
        with mosek.Env() as env:

            # Create a task object and attach log stream printer
            with env.Task(0, 0) as task:
                task.set_Stream(mosek.streamtype.log, streamprinter)

                # Bound keys for constraints
                bkc = [mosek.boundkey.fx,
                       mosek.boundkey.fx]

                # Bound values for constraints
                blc = [1.0, 0.5]
                buc = [1.0, 0.5]

                # Below is the sparse representation of the A
                # matrix stored by row.
                asub = [[0],
                        [1, 2]]
                aval = [[1.0],
                        [1.0, 1.0]]

                conesub = [0, 1, 2]

                barci = [0, 1, 1, 2, 2]
                barcj = [0, 0, 1, 1, 2]
                barcval = [2.0, 1.0, 2.0, 1.0, 2.0]

                barai = [[0, 1, 2],
                         [0, 1, 2, 1, 2, 2]]
                baraj = [[0, 1, 2],
                         [0, 0, 0, 1, 1, 2]]
                baraval = [[1.0, 1.0, 1.0],
                           [1.0, 1.0, 1.0, 1.0, 1.0, 1.0]]

                numvar = 3
                numcon = len(bkc)
                BARVARDIM = [3]

                # Append 'numvar' variables.
                # The variables will initially be fixed at zero (x=0).
                task.appendvars(numvar)

                # Append 'numcon' empty constraints.
                # The constraints will initially have no bounds.
                task.appendcons(numcon)

                # Append matrix variables of sizes in 'BARVARDIM'.
                # The variables will initially be fixed at zero.
                task.appendbarvars(BARVARDIM)

                # Set the linear term c_0 in the objective.
                task.putcj(0, 1.0)

                for j in range(numvar):
                    # Set the bounds on variable j
                    # blx[j] <= x_j <= bux[j]
                    task.putvarbound(j, mosek.boundkey.fr, -inf, +inf)

                for i in range(numcon):
                    # Set the bounds on constraints.
                    # blc[i] <= constraint_i <= buc[i]
                    task.putconbound(i, bkc[i], blc[i], buc[i])

                    # Input row i of A
                    task.putarow(i,                 # Constraint (row) index.
                                 asub[i],           # Column indices of non-zeros in row i.
                                 aval[i])           # Non-zero values of row i.

                task.appendcone(mosek.conetype.quad,
                                0.0,
                                conesub)

                symc = task.appendsparsesymmat(BARVARDIM[0],
                                               barci,
                                               barcj,
                                               barcval)

                syma0 = task.appendsparsesymmat(BARVARDIM[0],
                                                barai[0],
                                                baraj[0],
                                                baraval[0])

                syma1 = task.appendsparsesymmat(BARVARDIM[0],
                                                barai[1],
                                                baraj[1],
                                                baraval[1])

                task.putbarcj(0, [symc], [1.0])

                task.putbaraij(0, 0, [syma0], [1.0])
                task.putbaraij(1, 0, [syma1], [1.0])

                # Input the objective sense (minimize/maximize)
                task.putobjsense(mosek.objsense.minimize)

                # Solve the problem and print summary
                task.optimize()
                task.solutionsummary(mosek.streamtype.msg)

                # Get status information about the solution
                solsta = task.getsolsta(mosek.soltype.itr)

                if (solsta == mosek.solsta.optimal or
                        solsta == mosek.solsta.near_optimal):
                    xx = [0.] * numvar
                    task.getxx(mosek.soltype.itr, xx)

                    lenbarvar = BARVARDIM[0] * (BARVARDIM[0] + 1) / 2
                    barx = [0.] * int(lenbarvar)
                    task.getbarxj(mosek.soltype.itr, 0, barx)

                    print("Optimal solution:\nx=%s\nbarx=%s" % (xx, barx))
                elif (solsta == mosek.solsta.dual_infeas_cer or
                      solsta == mosek.solsta.prim_infeas_cer or
                      solsta == mosek.solsta.near_dual_infeas_cer or
                      solsta == mosek.solsta.near_prim_infeas_cer):
                    print("Primal or dual infeasibility certificate found.\n")
                elif solsta == mosek.solsta.unknown:
                    print("Unknown solution status")
                else:
                    print("Other solution status")

    # call the main function
    try:
        main()
    except mosek.MosekException as e:
        print("ERROR: %s" % str(e.errno))
        if e.msg is not None:
            print("\t%s" % e.msg)
        sys.exit(1)
    except:
        import traceback
        traceback.print_exc()
        sys.exit(1)