simulated_annealing_tsp(G, init_cycle, weight='weight', source=None, temp=100, move='1-1', max_iterations=10, N_inner=100, alpha=0.01, seed=None)
This function uses simulated annealing to approximate the minimal cost cycle through the nodes. Starting from a suboptimal solution, simulated annealing perturbs that solution, occasionally accepting changes that make the solution worse in order to escape a locally optimal solution. The chance of accepting such changes decreases over the iterations to encourage an optimal result. In summary, the function returns a cycle starting at `source` for which the total cost is minimized. The cost of the returned cycle can then be computed by summing its edge weights, as in the Examples below.
The chance of accepting a proposed change is governed by a parameter called the temperature (the name comes from the physical process of annealing, in which a metal is cooled slowly). As the temperature is reduced, the chance of accepting moves that increase the cost goes down.
Simulated annealing is a metaheuristic local search algorithm. Its main characteristic is that it sometimes accepts solutions that increase the cost, in order to escape low-quality local optima.
This algorithm needs an initial solution. If not provided, it is constructed by a simple greedy algorithm. At every iteration, the algorithm selects a neighbor of the current solution. Let $c(x)$ be the cost of the current solution and $c(x')$ the cost of a neighbor solution. If $c(x') - c(x) \leq 0$, the neighbor solution becomes the current solution for the next iteration. Otherwise, the neighbor solution is accepted with probability $p = \exp(-[c(x') - c(x)] / temp)$; if it is not accepted, the current solution is retained.
`temp` is a parameter of the algorithm and represents the temperature.
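To make the acceptance rule above concrete, here is a minimal sketch (not the library's internal code) of how the decision for a single proposed move could be implemented; the name `accept_move` and its signature are illustrative only:

>>> import math, random
>>> def accept_move(cost_current, cost_neighbor, temp, rng=random):
...     # Improving (or equal-cost) moves are always accepted.
...     delta = cost_neighbor - cost_current
...     if delta <= 0:
...         return True
...     # Worsening moves are accepted with probability exp(-delta / temp),
...     # which shrinks as the temperature is lowered.
...     return rng.random() < math.exp(-delta / temp)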
Time complexity: For $N_i$ iterations of the inner loop and $N_o$ iterations of the outer loop, this algorithm has running time $O(N_i \cdot N_o \cdot |V|)$.
For more information on the algorithm and its physical inspiration, see: http://en.wikipedia.org/wiki/Simulated_annealing
G
`G` should be a complete weighted graph. The distance between all pairs of nodes should be included.
init_cycle
The initial solution (a cycle through all nodes returning to the start). This argument has no default to make you think about it. If "greedy", the initial cycle is built with `greedy_tsp(G, weight)`. Other common starting cycles are `list(G) + [next(iter(G))]` or the final result of `simulated_annealing_tsp` when doing `threshold_accepting_tsp`.
weight
Edge data key corresponding to the edge weight. If any edge does not have this attribute, the weight is set to 1.
source
Starting node. If None, defaults to `next(iter(G))`.
temp
The algorithm's temperature parameter. It represents the initial value of the temperature.
move
Indicator of what move to use when finding new trial solutions. Strings indicate two special built-in moves:
"1-1": 1-1 exchange which transposes the position of two elements of the current solution. The function called is swap_two_nodes
. For example if we apply 1-1 exchange in the solution A = [3, 2, 1, 4, 3]
we can get the following by the transposition of 1 and 4 elements: A' = [3, 2, 4, 1, 3]
"1-0": 1-0 exchange which moves an node in the solution to a new position. The function called is move_one_node
. For example if we apply 1-0 exchange in the solution A = [3, 2, 1, 4, 3]
we can transfer the fourth element to the second position: A' = [3, 4, 2, 1, 3]
You may provide your own function to enact a move from one solution to a neighbor solution. The function must take the solution as input along with a `seed` input to control random number generation (see the `seed` parameter here). Your function should maintain the solution as a cycle with equal first and last node and all others appearing once, and it should return the new solution. An illustrative sketch of such a function appears after the parameter descriptions below.
max_iterations
Declared done when this number of consecutive iterations of the outer loop occurs without any change in the best cost solution.
N_inner
The number of iterations of the inner loop.
alpha
Percentage of temperature decrease in each iteration of the outer loop.
seed
Indicator of random number generation state. See the Randomness section of the NetworkX documentation.
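As an illustration of a user-supplied move (see the `move` parameter above), the following sketch reverses a random interior segment of the cycle, in the style of a 2-opt perturbation. The name `reverse_segment` is hypothetical, and the sketch assumes, as for the built-in moves, that `seed` behaves like a `random.Random` instance:

>>> def reverse_segment(soln, seed):
...     # Pick two interior positions; the first and last entries of the
...     # cycle are left untouched so they remain equal.
...     i, j = sorted(seed.sample(range(1, len(soln) - 1), k=2))
...     # Reverse the segment between them and return the modified cycle.
...     soln[i : j + 1] = reversed(soln[i : j + 1])
...     return soln

Such a function can then be passed directly, e.g. `approx.simulated_annealing_tsp(G, "greedy", move=reverse_segment)`.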
If `G` is not a complete graph, the algorithm raises an exception.
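If it is unclear whether a given graph meets this completeness requirement, a quick check along the following lines can be run first; the helper `is_complete_graph` is a hypothetical sketch, not part of the NetworkX API:

>>> def is_complete_graph(G):
...     # Every ordered pair of distinct nodes must be joined by an edge.
...     return all(G.has_edge(u, v) for u in G for v in G if u != v)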
Returns the cycle (list of nodes) that a salesman can follow to minimize the total weight of the trip.
Returns an approximate solution to the traveling salesman problem.
>>> from networkx.algorithms import approximation as approx
>>> G = nx.DiGraph()
>>> G.add_weighted_edges_from({
...     ("A", "B", 3), ("A", "C", 17), ("A", "D", 14), ("B", "A", 3),
...     ("B", "C", 12), ("B", "D", 16), ("C", "A", 13), ("C", "B", 12),
...     ("C", "D", 4), ("D", "A", 14), ("D", "B", 15), ("D", "C", 2)
... })
>>> cycle = approx.simulated_annealing_tsp(G, "greedy", source="D")
>>> cost = sum(G[n][nbr]["weight"] for n, nbr in nx.utils.pairwise(cycle))
>>> cycle
['D', 'C', 'B', 'A', 'D']
>>> cost
31
>>> incycle = ["D", "B", "A", "C", "D"]
>>> cycle = approx.simulated_annealing_tsp(G, incycle, source="D")
>>> cost = sum(G[n][nbr]["weight"] for n, nbr in nx.utils.pairwise(cycle))
>>> cycle
['D', 'C', 'B', 'A', 'D']
>>> cost
31
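The cooling schedule can also be adjusted through `temp`, `alpha` and `N_inner`; the values below are purely illustrative, and no output is shown because the resulting cycle depends on the random state:

>>> cycle = approx.simulated_annealing_tsp(
...     G, "greedy", source="D", temp=500, alpha=0.001, N_inner=200
... )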
See also:
The following pages refer to this document either explicitly or contain code examples using it:
networkx.algorithms.approximation.traveling_salesman.simulated_annealing_tsp
networkx.algorithms.approximation.traveling_salesman.traveling_salesman_problem
networkx.algorithms.approximation.traveling_salesman.threshold_accepting_tsp