# 11.8 Travelling Salesman Problem (TSP)¶

The *Travelling Salesman Problem* is one of the most famous and studied problems in combinatorics and integer optimization. In this case study we shall:

- show how to compactly define a model with *Fusion*;
- implement an iterative algorithm that solves a sequence of optimization problems;
- modify an optimization problem by adding more constraints;
- show how to access the solution of an optimization problem.

The material presented in this section draws inspiration from [Pat03].

In a TSP instance we are given a directed graph \(G=(N,A)\), where \(N\) is the set of nodes and \(A\) is the set of arcs. To each arc \((i,j)\in A\) corresponds a nonnegative cost \(c_{ij}\). The goal is to find a minimum cost *Hamilton cycle* in \(G\), that is a closed tour passing through each node exactly once. For example, consider the small directed graph in Fig. 11.9.
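Because the example instance is tiny, the optimal tour can be checked by brute-force enumeration. The sketch below is an illustration, not part of the Fusion model; `best_tour` is a hypothetical helper, and the arc costs are taken from the complete example at the end of this section:

```python
from itertools import permutations

# Arcs and costs of the example graph (Fig. 11.9).
cost = {(0,1): 1., (1,2): 1., (2,3): 1., (3,0): 1.,
        (1,0): .1, (0,2): .1, (2,1): .1, (0,3): .1}

def best_tour(n):
    """Enumerate all Hamiltonian cycles through node 0, keep the cheapest."""
    best, best_cost = None, float('inf')
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        arcs = list(zip(tour, tour[1:]))
        if all(a in cost for a in arcs):      # use only arcs present in the graph
            c = sum(cost[a] for a in arcs)
            if c < best_cost:
                best, best_cost = tour, c
    return best, best_cost

print(best_tour(4))   # -> ((0, 1, 2, 3, 0), 4.0)
```

The only Hamiltonian cycle in this graph is 0->1->2->3->0 with cost 4, which matches the final cost reported by the solver output later in this section.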

Its corresponding adjacency and cost matrices \(A\) and \(c\) are:

\[
A = \begin{bmatrix} 0 & 1 & 1 & 1 \\ 1 & 0 & 1 & 0 \\ 0 & 1 & 0 & 1 \\ 1 & 0 & 0 & 0 \end{bmatrix},
\qquad
c = \begin{bmatrix} 0 & 1 & 0.1 & 0.1 \\ 0.1 & 0 & 1 & 0 \\ 0 & 0.1 & 0 & 1 \\ 1 & 0 & 0 & 0 \end{bmatrix}.
\]

Typically, the problem is modeled by introducing a set of binary variables \(x_{ij}\) such that

\[
x_{ij} = \begin{cases} 0 & \text{if arc } (i,j) \text{ is not in the tour,}\\ 1 & \text{if arc } (i,j) \text{ is in the tour.} \end{cases}
\]

Now we can introduce the following simple model:

\[
\begin{array}{lll}
\min & \sum_{i,j} c_{ij} x_{ij} & \\
\text{s.t.} & \sum_{j=0}^{n-1} x_{ij} = 1, & i=0,\ldots,n-1,\\
& \sum_{i=0}^{n-1} x_{ij} = 1, & j=0,\ldots,n-1,\\
& x_{ij} \le A_{ij}, & \\
& x_{ij} \in \{0,1\}. &
\end{array}
\tag{11.31}
\]

It states that every vertex has exactly one incoming and one outgoing arc in the tour, and that only arcs present in the graph can be used. Problem (11.31) can be easily implemented in *Fusion*:

```
with Model() as M:
    M.setLogHandler(sys.stdout)

    x = M.variable([n,n], Domain.binary())

    M.constraint(Expr.sum(x,0), Domain.equalsTo(1.0))
    M.constraint(Expr.sum(x,1), Domain.equalsTo(1.0))
    M.constraint(x, Domain.lessThan( A ))

    M.objective(ObjectiveSense.Minimize, Expr.dot(C, x))
```

Note in particular how:

- we can sum over rows and/or columns using the `Expr.sum` function;
- we use `Expr.dot` to compute the objective function.

The solution to problem (11.31) is not necessarily a closed tour. In fact (11.31) models another problem known as *minimum cost cycle cover*, whose solution may consist of more than one cycle. In our example we get the solution depicted in Fig. 11.9, i.e. there are two loops, namely 0->3->0 and 1->2->1.

A solution to (11.31) solves the TSP if and only if it consists of a single cycle. One classical approach to ensure this is the so-called *subtour elimination*: once we have found a solution of (11.31) composed of at least two cycles, we add constraints that explicitly forbid each of those cycles \(c\):

\[
\sum_{(i,j)\in c} x_{ij} \le |c| - 1.
\tag{11.32}
\]

Thus the problem we want to solve at each step is

\[
\begin{array}{ll}
\min & \sum_{i,j} c_{ij} x_{ij}\\
\text{s.t.} & \sum_{j=0}^{n-1} x_{ij} = 1, \quad i=0,\ldots,n-1,\\
& \sum_{i=0}^{n-1} x_{ij} = 1, \quad j=0,\ldots,n-1,\\
& x_{ij} \le A_{ij},\\
& x_{ij} \in \{0,1\},\\
& \sum_{(i,j)\in c} x_{ij} \le |c| - 1 \quad \forall c \in C,
\end{array}
\tag{11.33}
\]

where \(C\) is the set of cycles in all the cycle covers we have seen so far. The overall solution scheme is the following:

1. set \(C\) to the empty set;
2. solve problem (11.33);
3. **if** \(x\) has only one cycle, **stop**; **else** add the cycles of \(x\) to \(C\) and **goto** 2.
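The scheme above can be sketched as a plain-Python loop. Here `solve_cycle_cover` and `find_cycles` are hypothetical stand-ins for the Fusion model and the cycle-detection step; in this sketch they are mocked with the two iterations observed on the example instance:

```python
def subtour_elimination(solve_cycle_cover, find_cycles):
    """Iterate: solve with the current forbidden-cycle set C, stop at a single cycle."""
    C = []                               # step 1: C starts empty
    while True:
        x = solve_cycle_cover(C)         # step 2: solve (11.33) with cuts for C
        cycles = find_cycles(x)
        if len(cycles) == 1:             # step 3: a single cycle is a tour -> stop
            return x, cycles[0]
        C.extend(cycles)                 # else forbid these cycles and repeat

# Mock "solver" reproducing the two iterations seen on the example instance:
cover2 = [[0,3],[3,0],[1,2],[2,1]]       # first solution: two loops
tour   = [[0,1],[1,2],[2,3],[3,0]]       # second solution: one tour
solve  = lambda C: cover2 if len(C) == 0 else tour
find   = lambda x: [[[0,3],[3,0]], [[1,2],[2,1]]] if x == cover2 else [tour]

x, cycle = subtour_elimination(solve, find)
print(cycle)   # -> [[0, 1], [1, 2], [2, 3], [3, 0]]
```

The real model-building and cycle-detection code appears in the complete example at the end of this section.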

Cycle detection is a fairly easy task, and we omit the procedure here for the sake of simplicity. Now we show how to add a constraint for each cycle. Since we have the list of arcs, and each one corresponds to a variable \(x_{ij}\), we can use the function `Variable.pick` to compactly define constraints of the form (11.32):

```
for c in cycles:
    M.constraint(Expr.sum(x.pick(c)), Domain.lessThan( 1.0 * len(c) - 1 ))
```
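The omitted cycle-detection step can be done by following successor nodes in the solution matrix. A minimal pure-Python sketch, independent of Fusion and assuming a solution matrix already rounded to 0/1 values (`extract_cycles` is a hypothetical helper, not part of the Fusion API):

```python
def extract_cycles(x, n):
    """Split a cycle cover {x[i][j] = 1 iff arc (i,j) is chosen} into cycles,
    each returned as a list of arcs [i, j] as expected by x.pick."""
    succ = {i: j for i in range(n) for j in range(n) if x[i][j] > 0.5}
    seen, cycles = set(), []
    for start in range(n):
        if start in seen:
            continue
        cycle, i = [], start
        while i not in seen:             # follow successors until the loop closes
            seen.add(i)
            cycle.append([i, succ[i]])
            i = succ[i]
        cycles.append(cycle)
    return cycles

# First-iteration cycle cover of the example: 0->3->0 and 1->2->1.
cover = [[0,0,0,1],[0,0,1,0],[0,1,0,0],[1,0,0,0]]
print(extract_cycles(cover, 4))   # -> [[[0, 3], [3, 0]], [[1, 2], [2, 1]]]
```

Each returned cycle is exactly in the shape needed by the `x.pick(c)` call above.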

Executing our procedure will yield the following output:

```
it #1 - solution cost: 2.200000
cycles:
[0,3] - [3,0] -
[1,2] - [2,1] -
it #2 - solution cost: 4.000000
cycles:
[0,1] - [1,2] - [2,3] - [3,0] -
solution:
0 1 0 0
0 0 1 0
0 0 0 1
1 0 0 0
```

Thus we first discover the two-cycle solution; then the second iteration is forced not to include those cycles, and a new solution is located. This time it consists of one loop, and as expected the cost is higher. The solution is depicted in Fig. 11.9.

Formulation (11.33) can be improved in some cases by exploiting the graph structure. Some simple tricks follow.

Self-loops

Self-loops are never part of a TSP tour. A common workaround is to penalize them with a huge cost \(c_{ii}\). Although this works in practice, it is more advisable to simply fix the corresponding variables to zero, i.e.

\[
x_{ii} = 0, \quad i=0,\ldots,n-1.
\tag{11.34}
\]

This removes redundant variables, and avoids unnecessarily large coefficients that can negatively affect the solver.

Constraints (11.34) are easily implemented as follows:

```
M.constraint(x.diag(), Domain.equalsTo(0.))
```

Two-arc loops removal

In networks with more than two nodes, two-arc loops can also be excluded. They are simple to detect, and their number is of the same order as the size of the graph. The constraints we need to add are:

\[
x_{ij} + x_{ji} \le 1 \quad \forall i,j.
\tag{11.35}
\]

Constraints (11.35) are easily implemented as follows:

```
M.constraint(Expr.add(x, x.transpose()), Domain.lessThan(1.0))
```
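To see why this cuts off two-node loops, evaluate \(x + x^T\) on the first-iteration cycle cover of the example: every entry equal to 2 violates the bound. A quick plain-Python check, not part of the Fusion model:

```python
# First-iteration cycle cover of the example: cycles 0->3->0 and 1->2->1.
cover = [[0,0,0,1],[0,0,1,0],[0,1,0,0],[1,0,0,0]]
n = 4

# Entry (i,j) of x + x^T; values > 1 violate constraint (11.35).
s = [[cover[i][j] + cover[j][i] for j in range(n)] for i in range(n)]
violated = [(i, j) for i in range(n) for j in range(n) if s[i][j] > 1]
print(violated)   # -> [(0, 3), (1, 2), (2, 1), (3, 0)]
```

Both two-node loops of the first cover are infeasible under (11.35), so with these constraints the solver finds the tour in a single iteration.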

The complete working example

```
import sys
from mosek.fusion import *

def tsp(n, A, C, remove_selfloops, remove_2_hop_loops):
    with Model() as M:
        M.setLogHandler(sys.stdout)
        x = M.variable([n,n], Domain.binary())

        M.constraint(Expr.sum(x,0), Domain.equalsTo(1.0))
        M.constraint(Expr.sum(x,1), Domain.equalsTo(1.0))
        M.constraint(x, Domain.lessThan( A ))

        M.objective(ObjectiveSense.Minimize, Expr.dot(C, x))

        if remove_2_hop_loops:
            M.constraint(Expr.add(x, x.transpose()), Domain.lessThan(1.0))

        if remove_selfloops:
            M.constraint(x.diag(), Domain.equalsTo(0.))

        it = 1
        M.writeTask("tsp-0-%s-%s.ptf" % ('t' if remove_selfloops else 'f',
                                         't' if remove_2_hop_loops else 'f'))

        while True:
            print("\n\n--------------------\nIteration",it)
            M.solve()

            print('\nsolution cost:', M.primalObjValue())
            print('\nsolution:')

            cycles = []

            for i in range(n):
                xi = x.slice([i,0],[i+1,n])
                print(xi.level())

                for j in range(n):
                    if xi.level()[j] <= 0.5: continue

                    # Append arc (i,j) to the first cycle sharing a node with it,
                    # or start a new cycle.
                    found = False
                    for c in cycles:
                        if len( [ a for a in c if i in a or j in a ] ) > 0:
                            c.append( [i,j] )
                            found = True
                            break

                    if not found:
                        cycles.append([ [i,j] ])

            print('\ncycles:')
            print([c for c in cycles])

            if len(cycles) == 1:
                break

            for c in cycles:
                M.constraint(Expr.sum(x.pick(c)), Domain.lessThan( 1.0 * len(c) - 1 ))
            it = it + 1

        return x.level(), cycles[0]

    return [],[]

def main():
    A_i = [0,1,2,3,1,0,2,0]
    A_j = [1,2,3,0,0,2,1,3]
    C_v = [1.,1.,1.,1.,0.1,0.1,0.1,0.1]

    n = max(max(A_i),max(A_j))+1
    costs = Matrix.sparse(n,n,A_i,A_j,C_v)

    x,c = tsp(n, Matrix.sparse(n,n,A_i,A_j,1.), costs, True, True)
    x,c = tsp(n, Matrix.sparse(n,n,A_i,A_j,1.), costs, True, False)

main()
```