Sometimes you only need an approximation. I read a paper once about using TSP approximations for data recovery.
For any pair of picture data fragments, you'd calculate a goodness-of-fit score for placing them next to each other. E.g. a fragment from a JPG of an apple would fit badly against a fragment from a JPG of a car because they don't share any edges, but two consecutive fragments of the same car JPG would fit well.
Ordering the data fragments so that each fragment has a high goodness of fit (i.e. a low distance) to the previous one is then a traveling salesman problem.
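To make that concrete, here's a rough sketch of what the distance matrix might look like. The boundary-row comparison is my own stand-in for whatever scoring the paper actually used; the point is just that every ordered pair of fragments gets a cost, and that matrix is the TSP input.

    import numpy as np

    def fit_score(frag_a, frag_b):
        # frag_a, frag_b: 2D arrays of decoded pixel rows.
        # Hypothetical score: how well the last row of frag_a continues
        # into the first row of frag_b. Lower = better fit.
        return float(np.abs(frag_a[-1].astype(int) - frag_b[0].astype(int)).mean())

    def distance_matrix(fragments):
        n = len(fragments)
        d = np.full((n, n), np.inf)
        for i in range(n):
            for j in range(n):
                if i != j:
                    d[i, j] = fit_score(fragments[i], fragments[j])
        return d  # feed this to a TSP heuristic or solver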
The paper then went on to say that nobody can solve TSP instances of the required size exactly, but gave examples where the spanning-tree heuristic worked pretty well.
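The spanning-tree heuristic itself is short enough to sketch: build a minimum spanning tree over the (symmetrised) distances and read the fragments off in a depth-first preorder walk of that tree. For genuinely metric distances this is the classic 2-approximation; fragment fit scores aren't metric, so here it's just a heuristic. Using networkx is my choice for brevity, not anything from the paper.

    import networkx as nx

    def mst_order(dist):
        n = len(dist)
        g = nx.Graph()
        for i in range(n):
            for j in range(i + 1, n):
                # symmetrise, since fit(i, j) need not equal fit(j, i)
                g.add_edge(i, j, weight=min(dist[i][j], dist[j][i]))
        tree = nx.minimum_spanning_tree(g)
        # visiting fragments in preorder of the MST gives the tour
        return list(nx.dfs_preorder_nodes(tree, source=0))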
Interesting. Though for combinatorial problems like that, a much more common approach seems to be to formulate them as an integer linear program.
Integer linear programming is NP-hard in general. (It has to be, since you can encode the TSP as an integer linear programme.) But it's usually much easier to 'write' a combinatorial problem as an integer linear programme than to reduce it to a TSP.
And integer linear programming solvers are very advanced these days.
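For concreteness, here's roughly what 'writing it as an integer linear programme' looks like for the TSP itself, using PuLP as the modelling layer (my choice, not something from the thread) and the Miller-Tucker-Zemlin subtour-elimination constraints.

    import pulp

    def order_fragments_ilp(dist):
        n = len(dist)
        prob = pulp.LpProblem("fragment_ordering", pulp.LpMinimize)
        # x[i][j] = 1 means fragment j directly follows fragment i in the tour
        x = [[pulp.LpVariable(f"x_{i}_{j}", cat="Binary") if i != j else None
              for j in range(n)] for i in range(n)]
        # MTZ position variables
        u = [pulp.LpVariable(f"u_{i}", lowBound=0, upBound=n - 1) for i in range(n)]

        # minimise total badness of fit along the tour
        prob += pulp.lpSum(float(dist[i][j]) * x[i][j]
                           for i in range(n) for j in range(n) if i != j)
        for i in range(n):
            prob += pulp.lpSum(x[i][j] for j in range(n) if j != i) == 1  # one successor
            prob += pulp.lpSum(x[j][i] for j in range(n) if j != i) == 1  # one predecessor
        # MTZ constraints forbid disconnected subtours
        for i in range(1, n):
            for j in range(1, n):
                if i != j:
                    prob += u[i] - u[j] + n * x[i][j] <= n - 1

        prob.solve()
        succ = {i: j for i in range(n) for j in range(n)
                if i != j and pulp.value(x[i][j]) > 0.5}
        order, cur = [0], 0
        while len(order) < n:
            cur = succ[cur]
            order.append(cur)
        return order  # a cyclic order; cut it at the worst edge to get a path

The MTZ constraints keep the model compact but give a fairly weak relaxation; serious TSP codes add subtour cuts lazily instead, which is part of why modern solvers handle much larger instances than this sketch would.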