# 11.8 Travelling Salesman Problem (TSP)¶

The Travelling Salesman Problem is one of the most famous and studied problems in combinatorics and integer optimization. In this case study we shall:

• show how to compactly define a model with Fusion;
• implement an iterative algorithm that solves a sequence of optimization problems;
• modify an optimization problem by adding more constraints;
• show how to access the solution of an optimization problem.

The material presented in this section draws inspiration from [Pat03].

In a TSP instance we are given a directed graph $$G=(N,A)$$, where $$N$$ is the set of nodes and $$A$$ is the set of arcs. To each arc $$(i,j)\in A$$ corresponds a nonnegative cost $$c_{ij}$$. The goal is to find a minimum cost Hamilton cycle in $$G$$, that is a closed tour passing through each node exactly once. For example, consider the small directed graph in Fig. 11.9.

Fig. 11.9 (Left) a directed graph with costs. (Middle) The minimum cycle cover found in the first iteration. (Right) The minimum cost travelling salesman tour.

Its corresponding adjacency and cost matrices $$A$$ and $$c$$ are:

$\begin{split}\begin{array}{cc} A= \left[ \begin{array}{cccc} 0 & 1 & 1 & 1\\ 1 & 0 & 1 & 0\\ 0 & 1 & 0 & 1\\ 1 & 0 & 0 & 0 \end{array} \right], & c= \left[ \begin{array}{cccc} 0 & 1 & 0.1 & 0.1\\ 0.1 & 0 & 1 & 0\\ 0 & 0.1 & 0 & 1\\ 1 & 0 & 0 & 0 \end{array} \right]. \end{array}\end{split}$

Typically, the problem is modeled introducing a set of binary variables $$x_{ij}$$ such that

$\begin{split}x_{ij} = \left\lbrace \begin{array}{cl} 1 & \text{if arc } (i,j) \text{ is in the tour}, \\ 0 & \text{otherwise}. \end{array}\right.\end{split}$

Now we can introduce the following simple model:

(11.31)$\begin{split}\begin{array}{lll} \min & \sum_{i,j} c_{ij}x_{ij} &\\ \st & \sum_i x_{ij} = 1 & \forall j=1,\ldots,n, \\ & \sum_j x_{ij} = 1 & \forall i=1,\ldots,n, \\ & x_{ij}\leq A_{ij} & \forall i,j,\\ & x_{ij}\in \{0,1\} & \forall i,j. \end{array}\end{split}$

It encodes the constraints that every node has exactly one incoming and one outgoing arc in the tour, and that only arcs present in the graph can be used. Problem (11.31) can be easily implemented in Fusion:

    with Model() as M:
        M.setLogHandler(sys.stdout)
        x = M.variable([n,n], Domain.binary())

        M.constraint(Expr.sum(x,0), Domain.equalsTo(1.0))
        M.constraint(Expr.sum(x,1), Domain.equalsTo(1.0))
        M.constraint(x, Domain.lessThan( A ))

        M.objective(ObjectiveSense.Minimize, Expr.dot(C, x))


Note in particular how:

• the constraints $$\sum_i x_{ij}=1$$ for all $$j$$ are expressed in a single line by summing the variable matrix along its first dimension with Expr.sum(x,0), and similarly Expr.sum(x,1) yields the row sums;
• the bound $$x_{ij}\leq A_{ij}$$ is imposed on the whole matrix variable with one constraint;
• the objective $$\sum_{i,j} c_{ij}x_{ij}$$ is written compactly as the inner product Expr.dot(C, x).

The solution to problem (11.31) is not necessarily a closed tour. In fact (11.31) models another problem known as minimum cost cycle cover, whose solution may consist of more than one cycle. In our example we get the solution depicted in Fig. 11.9, i.e. there are two loops, namely 0->3->0 and 1->2->1.

A solution to (11.31) solves the TSP if and only if it consists of a single cycle. One classical approach to ensure this is so-called subtour elimination: once we find a solution of (11.31) composed of two or more cycles, we add constraints that explicitly exclude those cycles:

(11.32)$\sum_{(i,j)\in c} x_{ij} \leq |c| -1 \quad \forall c\in C.$

Thus the problem we want to solve at each step is

(11.33)$\begin{split}\begin{array}{lll} \min & \sum_{i,j} c_{ij}x_{ij} &\\ \st & \sum_i x_{ij} = 1 & \forall j=1,\ldots,n, \\ & \sum_j x_{ij} = 1 & \forall i=1,\ldots,n, \\ & x_{ij}\leq A_{ij} & \forall i,j,\\ & x_{ij}\in \{0,1\} & \forall i,j,\\ & \sum_{(i,j)\in c} x_{ij} \leq |c| -1 & \forall c\in C, \end{array}\end{split}$

where $$C$$ is the set of cycles in all the cycle covers we have seen so far. The overall solution scheme is the following:

1. set $$C$$ as the empty set,
2. solve problem (11.33),
3. if $$x$$ has only one cycle stop,
4. else add the cycles of $$x$$ to $$C$$ and goto 2.
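The steps above can be sketched as a generic driver loop. Here `solve_assignment` and `extract_cycles` are hypothetical callbacks (not part of Fusion) standing in for solving (11.33) with the current cuts and for splitting a solution into cycles:

```python
def subtour_elimination(solve_assignment, extract_cycles):
    """Generic sketch of the scheme above; the two callbacks are
    hypothetical stand-ins for solving (11.33) and detecting cycles."""
    banned = []                          # step 1: the set C starts empty
    while True:
        x = solve_assignment(banned)     # step 2: solve (11.33) with cuts C
        cycles = extract_cycles(x)
        if len(cycles) == 1:             # step 3: a single cycle is a tour
            return x, cycles[0]
        banned.extend(cycles)            # step 4: add the cycles to C, repeat
```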

Cycle detection is a fairly easy task and we omit the procedure here for the sake of simplicity. Now we show how to add a constraint for each cycle. Since we have the list of arcs, and each one corresponds to a variable $$x_{ij}$$, we can use the function Variable.pick to compactly define constraints of the form (11.32):

    for c in cycles:
        M.constraint(Expr.sum(x.pick(c)), Domain.lessThan( 1.0 * len(c) - 1 ))

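The omitted cycle-detection step can be sketched as follows, assuming the solution is available as a dense $$n\times n$$ 0/1 matrix (in the Fusion code it would be recovered from x.level()). Each cycle is returned as a list of [i,j] index pairs, which is the format Variable.pick accepts:

```python
def extract_cycles(x, n):
    """Split a 0/1 cycle-cover matrix into its cycles by following
    each node's unique successor until the walk closes."""
    succ = {i: j for i in range(n) for j in range(n) if x[i][j] > 0.5}
    seen = set()
    cycles = []
    for start in range(n):
        if start in seen:
            continue
        cycle, i = [], start
        while i not in seen:
            seen.add(i)
            cycle.append([i, succ[i]])   # record arc (i, succ(i))
            i = succ[i]
        cycles.append(cycle)
    return cycles
```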

Executing our procedure will yield the following output:

it #1 - solution cost: 2.200000

cycles:
[0,3] - [3,0] -
[1,2] - [2,1] -

it #2 - solution cost: 4.000000

cycles:
[0,1] - [1,2] - [2,3] - [3,0] -

solution:
0  1  0  0
0  0  1  0
0  0  0  1
1  0  0  0


Thus we first discover the solution consisting of two cycles; the second iteration is then forced to exclude those cycles, and a new solution is found. This time it consists of a single cycle and, as expected, its cost is higher. The tour is depicted in Fig. 11.9.
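Since the example graph is tiny, both reported objective values can be cross-checked by brute force in plain Python, independently of the solver:

```python
from itertools import permutations

# Adjacency and cost matrices of the example graph (Fig. 11.9)
A = [[0,1,1,1],[1,0,1,0],[0,1,0,1],[1,0,0,0]]
c = [[0,1,0.1,0.1],[0.1,0,1,0],[0,0.1,0,1],[1,0,0,0]]

def tour_cost(tour):
    """Cost of a closed tour, or infinity if it uses a missing arc."""
    arcs = list(zip(tour, tour[1:] + [tour[0]]))
    if any(A[i][j] == 0 for i, j in arcs):
        return float('inf')
    return sum(c[i][j] for i, j in arcs)

# The cycle cover found in iteration 1: 0->3->0 and 1->2->1
cover_cost = c[0][3] + c[3][0] + c[1][2] + c[2][1]   # = 2.2

# The cheapest Hamiltonian tour, as found in iteration 2
best = min((list(p) for p in permutations(range(4))), key=tour_cost)
# tour_cost(best) == 4.0
```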

Formulation (11.33) can be improved in some cases by exploiting the graph structure. Some simple tricks follow.

Self-loops

Self-loops are never part of a TSP tour. Typically self-loops are removed by penalizing them with a huge cost $$c_{ii}$$. Although this works in practice, it is more advisable to just fix the corresponding variables to zero, i.e.

(11.34)$x_{ii} = 0\quad \forall i=1,\ldots,n.$

This removes redundant variables, and avoids unnecessarily large coefficients that can negatively affect the solver.

Constraints (11.34) are easily implemented as follows:

            M.constraint(x.diag(), Domain.equalsTo(0.))


Two-arc loops removal

In networks with more than two nodes, two-arc loops (subtours of the form $$i\rightarrow j\rightarrow i$$) can also be excluded up front. They are simple to detect and their number is of the same order as the size of the graph. The constraints we need to add are:

(11.35)$x_{ij} + x_{ji} \leq 1 \quad \forall i,j=1,\ldots,n.$

Constraints (11.35) are easily implemented as follows:

            M.constraint(Expr.add(x, x.transpose()), Domain.lessThan(1.0))


The complete working example

Listing 11.16 The complete code for the TSP examples.

    import sys
    from mosek.fusion import *

    def tsp(n, A, C, remove_selfloops, remove_2_hop_loops):
        with Model() as M:
            M.setLogHandler(sys.stdout)
            x = M.variable([n,n], Domain.binary())

            M.constraint(Expr.sum(x,0), Domain.equalsTo(1.0))
            M.constraint(Expr.sum(x,1), Domain.equalsTo(1.0))
            M.constraint(x, Domain.lessThan( A ))

            M.objective(ObjectiveSense.Minimize, Expr.dot(C, x))

            if remove_2_hop_loops:
                M.constraint(Expr.add(x, x.transpose()), Domain.lessThan(1.0))

            if remove_selfloops:
                M.constraint(x.diag(), Domain.equalsTo(0.))

            it = 1
            M.writeTask("tsp-0-%s-%s.ptf" % ('t' if remove_selfloops else 'f',
                                             't' if remove_2_hop_loops else 'f'))

            while True:
                print("\n\n--------------------\nIteration", it)
                M.solve()

                print('\nsolution cost:', M.primalObjValue())
                print('\nsolution:')

                cycles = []

                for i in range(n):
                    xi = x.slice([i,0],[i+1,n])
                    print(xi.level())

                    for j in range(n):
                        if xi.level()[j] <= 0.5: continue

                        # Attach arc (i,j) to an existing cycle sharing a node
                        found = False
                        for c in cycles:
                            if len( [ a for a in c if i in a or j in a ] ) > 0:
                                c.append( [i,j] )
                                found = True
                                break

                        if not found:
                            cycles.append([ [i,j] ])

                print('\ncycles:')
                print([c for c in cycles])

                if len(cycles) == 1:
                    break

                # Add a subtour elimination constraint for each detected cycle
                for c in cycles:
                    M.constraint(Expr.sum(x.pick(c)), Domain.lessThan( 1.0 * len(c) - 1 ))
                it = it + 1

            return x.level(), cycles[0]

        return [], []

    def main():
        A_i = [0,1,2,3,1,0,2,0]
        A_j = [1,2,3,0,0,2,1,3]
        C_v = [1.,1.,1.,1.,0.1,0.1,0.1,0.1]
        n = max(max(A_i), max(A_j)) + 1
        costs = Matrix.sparse(n, n, A_i, A_j, C_v)
        x, c = tsp(n, Matrix.sparse(n,n,A_i,A_j,1.), costs, True, True)
        x, c = tsp(n, Matrix.sparse(n,n,A_i,A_j,1.), costs, True, False)