11.10 Semidefinite Relaxation of MIQCQO Problems
In this case study we will discuss a fairly common application for Semidefinite Optimization: to define a continuous semidefinite relaxation of a mixed-integer quadratic optimization problem. This section is based on the method by Park and Boyd [PB15].
We will focus on problems of the form:

$$
\begin{array}{ll}
\mathrm{minimize} & x^T P x + 2 q^T x \\
\mathrm{subject\ to} & x \in \mathbb{Z}^n,
\end{array}
\tag{11.40}
$$

where $P \in \mathcal{S}_+^n$ is positive semidefinite and $q \in \mathbb{R}^n$. Many problems can be reformulated in the form (11.40), for example:

- integer least squares: minimize $\|Ax - b\|_2^2$ subject to $x \in \mathbb{Z}^n$,
- closest vector problem: minimize $\|v - z\|_2$ subject to $z \in \{Bx \,:\, x \in \mathbb{Z}^n\}$.
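As a quick check of how these examples fit the template, note that the closest vector problem is itself an integer least squares instance: substituting $z = Bx$ gives

$$
\min_{x \in \mathbb{Z}^n} \|Bx - v\|_2,
$$

i.e. an integer least squares problem with data $A = B$ and $b = v$ (squaring the objective does not change the minimizer).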
Following [PB15], we can derive a relaxed continuous model. We first relax the integrality constraint: every integer $x_i$ satisfies $x_i(x_i - 1) \geq 0$, so we may replace $x \in \mathbb{Z}^n$ by the weaker quadratic constraints

$$
\begin{array}{ll}
\mathrm{minimize} & x^T P x + 2 q^T x \\
\mathrm{subject\ to} & x_i(x_i - 1) \geq 0, \quad i = 1, \ldots, n.
\end{array}
$$
The last constraint is still non-convex. We introduce a new variable $X = x x^T$, which gives the equivalent formulation

$$
\begin{array}{ll}
\mathrm{minimize} & \mathrm{Tr}(PX) + 2 q^T x \\
\mathrm{subject\ to} & \mathrm{diag}(X) \geq x, \\
& X = x x^T.
\end{array}
$$
To get a conic problem we relax the last constraint to $X \succeq x x^T$ and apply the Schur complement. The final relaxation follows:

$$
\begin{array}{ll}
\mathrm{minimize} & \mathrm{Tr}(PX) + 2 q^T x \\
\mathrm{subject\ to} & \mathrm{diag}(X) \geq x, \\
& \begin{bmatrix} X & x \\ x^T & 1 \end{bmatrix} \succeq 0.
\end{array}
\tag{11.41}
$$
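For completeness, the Schur complement step is the observation that, since the lower-right block equals $1 > 0$,

$$
\begin{bmatrix} X & x \\ x^T & 1 \end{bmatrix} \succeq 0
\quad \Longleftrightarrow \quad
X - x x^T \succeq 0,
$$

so the semidefinite constraint in (11.41) is exactly the convex relaxation $X \succeq x x^T$ of the non-convex equation $X = x x^T$.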
Fusion Implementation
Implementing model (11.41) in Fusion is very simple. We assume the input data: the dimension $n$, the matrix $P$ and the vector $q$. We start by creating the optimization model:
M = Model()
The important step is to define a single PSD variable

$$
Z = \begin{bmatrix} X & x \\ x^T & 1 \end{bmatrix} \in \mathcal{S}_+^{n+1}.
$$

Our code will create $Z$ and then two slices corresponding to $X$ and $x$:
Z = M.variable("Z", Domain.inPSDCone(n+1))
X = Z[:-1, :-1]
x = Z[:-1, -1:]
Then we define the constraints:
M.constraint( X.diag() >= x )
M.constraint( Z[n, n] == 1 )
The objective function uses several available linear expressions:
M.objective( ObjectiveSense.Minimize,
             Expr.dot( P, X ) + 2.0 * Expr.dot(x, q) )
Note that the trace operator is not directly available in Fusion, but it can easily be defined from scratch.
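For instance, a trace helper can be written as a one-liner on top of the existing API. This is just a sketch, assuming X is a square Fusion variable (or a slice of one), so that it only relies on the Variable.diag() and Expr.sum calls already used above:

def trace(X):
    # sum of the diagonal entries of a square Fusion variable/slice
    return Expr.sum(X.diag())

In the model above such a helper is not needed, since Expr.dot(P, X) already evaluates Tr(PX) for symmetric P.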
Complete code
import sys
from mosek.fusion import *
# assumes the pythonic extensions (slicing, >=, == on Fusion objects) are enabled
import mosek.fusion.pythonic

def miqcqp_sdo_relaxation(n, P, q):
    M = Model()
    M.setLogHandler(sys.stdout)

    # Z = [X, x; x', 1] is a single (n+1)x(n+1) PSD variable
    Z = M.variable("Z", Domain.inPSDCone(n+1))
    X = Z[:-1, :-1]
    x = Z[:-1, -1:]

    # diag(X) >= x, lower-right corner of Z fixed to 1
    M.constraint( X.diag() >= x )
    M.constraint( Z[n, n] == 1 )

    # Tr(P X) + 2 q'x
    M.objective( ObjectiveSense.Minimize,
                 Expr.dot( P, X ) + 2.0 * Expr.dot(x, q) )
    return M
Numerical Examples
We now present some simple numerical experiments for the integer least squares problem:

$$
\begin{array}{ll}
\mathrm{minimize} & \|Ax - b\|_2^2 \\
\mathrm{subject\ to} & x \in \mathbb{Z}^n.
\end{array}
\tag{11.42}
$$

It corresponds to problem (11.40) with $P = A^T A$ and $q = -A^T b$.
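To verify this correspondence, expand the squared norm:

$$
\|Ax - b\|_2^2 = x^T A^T A\, x - 2\, (A^T b)^T x + b^T b,
$$

which, up to the constant $b^T b$, is the objective of (11.40) with $P = A^T A$ and $q = -A^T b$.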
An integer rounding xRound of the solution to (11.41) is a feasible integer solution to (11.42). We can compare it to the actual optimal integer solution xOpt, whenever the latter is available. Of course it is very simple to formulate the integer least squares problem in Fusion:
def int_least_squares(n, A, b):
    M = Model()
    M.setLogHandler(sys.stdout)

    # integer variable x and an upper bound t on ||Ax - b||_2
    x = M.variable("x", n, Domain.integral(Domain.unbounded()))
    t = M.variable("t", 1, Domain.unbounded())

    M.constraint( Expr.vstack(t, A @ x - b), Domain.inQCone() )
    M.objective( ObjectiveSense.Minimize, t )
    return M
All that remains is to compare the values of the objective function $\|Ax - b\|_2$ for the rounded and the optimal solution:
import numpy

# problem dimensions
n = 20
m = 2*n
# problem data
A = numpy.reshape(numpy.random.normal(0., 1.0, n*m), (m,n))
c = numpy.random.uniform(0., 1.0, n)
P = A.transpose().dot(A)
q = - P.dot(c)
b = A.dot(c)
# solve the problems
M = miqcqp_sdo_relaxation(n, P, q)
Mint = int_least_squares(n, A, b)
M.solve()
Mint.solve()
M.writeTask('M.ptf')
Mint.writeTask('Mint.ptf')
# rounded and optimal solution
xRound = numpy.rint(M.getVariable("Z").slice([0,n], [n,n+1]).level())
xOpt = numpy.rint(Mint.getVariable("x").level())
print(M.getSolverDoubleInfo("optimizerTime"), Mint.getSolverDoubleInfo("optimizerTime"))
print(numpy.linalg.norm(A.dot(xRound)-b), numpy.linalg.norm(A.dot(xOpt)-b))
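As an optional check (not part of the script above), the optimal value of the relaxation (11.41) is a lower bound on the integer optimum of (11.40); adding back the constant $b^T b$ turns it into a lower bound on $\|Ax - b\|_2^2$. A minimal sketch, reusing M, Mint, A, b, xRound and xOpt from the script above:

# lower bound from the SDP relaxation vs. the two integer solutions
lower_bound = M.primalObjValue() + b.dot(b)
print("lower bound on ||Ax-b||^2:", lower_bound)
print("rounded solution         :", numpy.linalg.norm(A.dot(xRound) - b)**2)
print("optimal solution         :", numpy.linalg.norm(A.dot(xOpt) - b)**2)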
Experimentally, the objective value for xRound is only a small factor larger than that of the optimal solution xOpt.