# 7.9 Model Parametrization and Reoptimization

This tutorial demonstrates how to construct a model with a fixed structure and reoptimize it by changing some of the input data. If you instead want to dynamically modify the model structure between optimizations by adding variables, constraints etc., see the other reoptimization tutorial Sec. 7.10 (Problem Modification and Reoptimization).

For this tutorial we solve the following variant of *linear regression with elastic net regularization*:

\[
\min_{x}\ \|Ax-b\|_2 + \lambda_1\|x\|_1 + \lambda_2\|x\|_2,
\]

where \(A\in\mathbb{R}^{m\times n}\), \(b\in\mathbb{R}^{m}\). The optimization variable is \(x\in\mathbb{R}^n\) and \(\lambda_1,\lambda_2\) are two nonnegative numbers indicating the tradeoff between the linear regression objective, a *lasso* (\(\ell_1\)-norm) penalty and a *ridge* (\(\ell_2\)-norm) regularization. The representation of this problem compatible with **MOSEK** input format is

\[
\begin{array}{lll}
\mathrm{minimize} & t + \lambda_1 \sum_{i=1}^n p_i + \lambda_2 q & \\
\mathrm{subject\ to} & (t, Ax-b) \in \mathcal{Q}^{m+1}, & \\
 & (p_i, x_i) \in \mathcal{Q}^{2}, & i=1,\ldots,n, \\
 & (q, x) \in \mathcal{Q}^{n+1}, &
\end{array}
\]

where \(\mathcal{Q}^{k}\) denotes the quadratic cone of dimension \(k\).
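As a side note, the elastic net objective is easy to evaluate directly in NumPy, which is handy for cross-checking solver output later in the tutorial. The helper name `elastic_net_obj` and the NumPy dependency are our own choices for this sketch, not part of the MOSEK API:

```python
import numpy as np

def elastic_net_obj(A, b, x, lambda1, lambda2):
    # |Ax - b|_2 + lambda1 * |x|_1 + lambda2 * |x|_2
    r = A @ x - b
    return (np.linalg.norm(r, 2)
            + lambda1 * np.linalg.norm(x, 1)
            + lambda2 * np.linalg.norm(x, 2))

# Data from the example below
A = np.array([[1.0, 2.0], [3.0, 4.0], [-2.0, -1.0], [-4.0, -3.0]])
b = np.array([0.1, 1.2, -1.1, 3.0])

# At x = 0 both penalty terms vanish and the objective is simply |b|_2.
print(elastic_net_obj(A, b, np.zeros(2), 0.1, 0.01))
```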

## 7.9.1 Creating a model

Before creating a parametrized model we should analyze which parts of the model are fixed once and for all, and which parts we intend to change between optimizations. Here we make the following assumptions:

- the matrix \(A\) will not change,
- we want to solve the problem for many target vectors \(b\),
- we want to experiment with different tradeoffs \(\lambda_1, \lambda_2\).

That leads us to construct the model with \(A\) provided from the start as fixed input and to declare \(b,\lambda_1,\lambda_2\) as parameters. The initial model construction is shown below. Parameters are objects of type `Parameter`, created with the method `Model.parameter`. We exploit the fact that parameters can have shapes, just like variables and expressions, and that they can be used everywhere within an expression where a constant of the same shape would be suitable.

```
from mosek.fusion import *

def initializeModel(m, n, A):
    M = Model()
    x = M.variable("x", n)
    # t >= |Ax-b|_2 where b is a parameter
    b = M.parameter("b", m)
    t = M.variable()
    M.constraint(Expr.vstack(t, A @ x - b) == Domain.inQCone())
    # p_i >= |x_i|, i=1..n
    p = M.variable(n)
    M.constraint(Expr.hstack(p, x) == Domain.inQCone())
    # q >= |x|_2
    q = M.variable()
    M.constraint(Expr.vstack(q, x) == Domain.inQCone())
    # Objective, parametrized with lambda1, lambda2
    # t + lambda1*sum(p) + lambda2*q
    lambda1 = M.parameter("lambda1")
    lambda2 = M.parameter("lambda2")
    obj = t + lambda1 * Expr.sum(p) + lambda2 * q
    M.objective(ObjectiveSense.Minimize, obj)
    # Return the ready model
    return M
```

For the purpose of the example we take \(m=4\), \(n=2\) and the matrix \(A\) shown in the code below, and we initialize the parametrized model:

```
# Create a small example
m, n = 4, 2
A = [[ 1.0,  2.0],
     [ 3.0,  4.0],
     [-2.0, -1.0],
     [-4.0, -3.0]]
M = initializeModel(m, n, A)

# For convenience retrieve some elements of the model
b = M.getParameter("b")
lambda1 = M.getParameter("lambda1")
lambda2 = M.getParameter("lambda2")
x = M.getVariable("x")
```

We made sure to keep references to the relevant elements of the model, in particular the parameter objects whose values we are about to set.

## 7.9.2 Setting parameters

For the first solve we use \(b = (0.1, 1.2, -1.1, 3.0)\), \(\lambda_1 = 0.1\) and \(\lambda_2 = 0.01\). Parameters are set with the method `Parameter.setValue`. We set the parameters and solve the model as follows:

```
# First solve
b.setValue([0.1, 1.2, -1.1, 3.0])
lambda1.setValue(0.1)
lambda2.setValue(0.01)
M.solve()
print("Objective {0}, solution x = {1}".format(M.primalObjValue(), x.level()))
```

## 7.9.3 Changing parameters

Let us say we now want to increase the weight of the lasso penalty in order to favor sparser solutions. We can simply change that parameter, leave the others unchanged, and re-solve:

```
# Increase lambda1
lambda1.setValue(0.5)
M.solve()
print("Objective {0}, solution x = {1}".format(M.primalObjValue(), x.level()))
```

Next, we might want to solve a few instances of the problem for another value of \(b\). Again, we reset the relevant parameters and solve:

```
# Now change the data completely
b.setValue([1.0, 1.0, 1.0, 1.0])
lambda1.setValue(0.0)
lambda2.setValue(0.0)
M.solve()
print("Objective {0}, solution x = {1}".format(M.primalObjValue(), x.level()))
# And increase lambda2
lambda2.setValue(1.4145)
M.solve()
print("Objective {0}, solution x = {1}".format(M.primalObjValue(), x.level()))
```
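As a sanity check, when both penalties are set to zero the problem reduces to ordinary least squares, so the third solve above can be reproduced without MOSEK at all. A minimal sketch, assuming NumPy is available:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [-2.0, -1.0], [-4.0, -3.0]])
b = np.ones(4)

# With lambda1 = lambda2 = 0 the model minimizes |Ax-b|_2,
# i.e. it solves an ordinary least-squares problem.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print("x =", x_ls)
print("residual =", np.linalg.norm(A @ x_ls - b))
```

For this particular data the system \(Ax=b\) happens to be solvable exactly, so the residual (and hence the objective reported by the solver) is zero.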

## 7.9.4 Additional remarks

Domains cannot be parametrized; therefore, to parametrize a bound such as \(x\geq p\), it is necessary to write it as \(x-p\geq 0\).

Coefficients of semidefinite terms cannot be parametrized. If it is necessary to have a parametrized expression such as \(p\bar{X}_{i,j}\), introduce an auxiliary scalar variable \(x=\bar{X}_{i,j}\) and use \(px\) in the model.

Parametrized models can be found in the following examples: `alan.py`, `portfolio_2_frontier.py`, `portfolio_5_card.py`, `total_variation.py`.