
>(And I don't understand how the reduction to traveling salesman problem would help here?)

Let the distance between line i rotated to position k and line i+1 rotated to position j be d_i(k, j). Then you want to find a sequence of rotations r_1..r_n so that

d_1(r_1, r_2) + d_2(r_2, r_3) + ... + d_(n-1)(r_(n-1), r_n)

is minimized. In plain terms, you want to align each line so that it's as similar as possible to the previous line as rotated, but that in turn depends on how you rotated the next-to-previous line, and so on up.

This is a restricted type of traveling salesman problem: each rotation offset of each line is a node, you visit exactly one node per line in line order, and you want the total edge cost (the distance between adjacent lines under the chosen rotations) to be minimized.

Since it's 1D, it should be pretty easy to solve with some dynamic programming (straightforward memoization). For each possible rotation of line k, calculate the minimal cost of aligning the lines down to k, ending in that rotation. Then, for every possible choice of rotating the (k+1)th line, check which rotation of the kth line minimizes the objective down to the (k+1)th line, given that you rotated the (k+1)th line in that particular manner. That gives the minima for every rotation of the (k+1)th line; go on to line k+2.
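
To make the DP concrete, here's a minimal sketch (the names and the dist(i, k, j) helper are mine, not from the original problem): dist(i, k, j) is assumed to give the distance between line i rotated by k and line i+1 rotated by j, and R is the number of possible rotation offsets per line. It runs in O(n * R^2).

    # Minimal DP sketch: n lines, R rotation offsets per line, and a
    # caller-supplied dist(i, k, j) = distance between line i rotated by k
    # and line i+1 rotated by j.
    def best_rotations(n, R, dist):
        cost = [0.0] * R   # cost[k]: minimal cost of lines 0..i with line i rotated by k
        choice = []        # choice[i][j]: best rotation of line i given line i+1 rotated by j
        for i in range(n - 1):
            new_cost = [float("inf")] * R
            best_prev = [0] * R
            for j in range(R):           # rotation of line i+1
                for k in range(R):       # rotation of line i
                    c = cost[k] + dist(i, k, j)
                    if c < new_cost[j]:
                        new_cost[j] = c
                        best_prev[j] = k
            cost = new_cost
            choice.append(best_prev)
        # Backtrack from the cheapest rotation of the last line.
        rotations = [min(range(R), key=lambda j: cost[j])]
        for best_prev in reversed(choice):
            rotations.append(best_prev[rotations[-1]])
        return list(reversed(rotations))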

EDIT: It's even easier, since you can hold one line constant and rotate the next. Whenever you rotate the first line by k, if you know the second line's optimal rotation for a zero rotation of the first line is j, you just rotate the second line by k+j. So none of the above complexity is actually needed. I'll keep the comment to show how it could be considered a restricted TSP.
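
A sketch of that simpler approach (again with made-up names): rel_dist(i, j) is assumed to give the distance between line i left as-is and line i+1 rotated by j, and the offsets just accumulate down the lines, modulo R.

    # Because only the relative rotation of adjacent lines matters, each
    # pairwise offset can be chosen on its own and then accumulated.
    def chain_rotations(n, R, rel_dist):
        rotations = [0]                  # hold the first line fixed
        for i in range(n - 1):
            j = min(range(R), key=lambda off: rel_dist(i, off))
            rotations.append((rotations[-1] + j) % R)
        return rotations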




Thanks!

Yes, you can always reduce problems in P or in NP to a TSP. (In this case, I didn't see the reduction immediately, hence my question.)

But a reduction to the general TSP is basically never useful, since the general TSP is so hard to solve. (Perhaps the reduction is useful when you don't yet know whether your problem is even in NP, or something harder.)


Sometimes you only need an approximation. I read a paper once about using TSP approximations for data recovery.

For any pair of picture data fragments, you'd calculate a goodness-of-fit score for placing them next to each other. E.g. a fragment of an apple JPG would fit badly next to a fragment of a car JPG because they don't share any edges, but two consecutive fragments of the same car JPG would fit well.

Then the problem of ordering the data fragments so that each fragment has a high goodness-of-fit (low distance) to the previous one is a traveling salesman problem.

The paper then went on to say that nobody can exactly solve TSP instances of the size required, but gave examples where the spanning tree heuristic worked pretty well.
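
For illustration (a generic greedy nearest-neighbour heuristic, not necessarily what the paper did), the ordering could be approximated like this, where fit(a, b) is an assumed scoring function with higher values meaning fragment b more plausibly follows fragment a:

    # Greedy nearest-neighbour ordering of fragments; fit(a, b) is an assumed
    # goodness-of-fit score (higher = b more plausibly follows a).
    def order_fragments(fragments, fit):
        remaining = set(range(len(fragments)))
        order = [remaining.pop()]        # arbitrary starting fragment
        while remaining:
            last = order[-1]
            nxt = max(remaining, key=lambda i: fit(fragments[last], fragments[i]))
            remaining.remove(nxt)
            order.append(nxt)
        return [fragments[i] for i in order]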


Interesting. Though for combinatorial problems like that, a much more common approach seems to be to formulate them as an integer linear programming problem.

Integer linear programming is also NP-hard in general. (It has to be, since you can encode the TSP as an integer linear programme.) But it's usually much easier to 'write' a linear programme than to reduce a combinatorial problem to a TSP.
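
For example, the TSP itself can be written as an integer linear programme using the classic Miller-Tucker-Zemlin subtour constraints. A sketch using the PuLP modelling library (my choice, not mentioned above), where d is an n x n distance matrix:

    # TSP as an integer linear programme (Miller-Tucker-Zemlin formulation),
    # modelled with PuLP; d is an n x n distance matrix (list of lists).
    from pulp import LpProblem, LpVariable, LpMinimize, LpBinary, lpSum

    def tsp_ilp(d):
        n = len(d)
        prob = LpProblem("tsp", LpMinimize)
        x = {(i, j): LpVariable(f"x_{i}_{j}", cat=LpBinary)
             for i in range(n) for j in range(n) if i != j}
        u = {i: LpVariable(f"u_{i}", lowBound=0, upBound=n - 1) for i in range(n)}
        prob += lpSum(d[i][j] * x[i, j] for (i, j) in x)                # tour length
        for i in range(n):
            prob += lpSum(x[i, j] for j in range(n) if j != i) == 1     # leave i once
            prob += lpSum(x[j, i] for j in range(n) if j != i) == 1     # enter i once
        for i in range(1, n):                                           # subtour elimination
            for j in range(1, n):
                if i != j:
                    prob += u[i] - u[j] + n * x[i, j] <= n - 1
        prob.solve()
        return [(i, j) for (i, j) in x if x[i, j].value() > 0.5]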

Linear programming solvers are very advanced these days.

The other general workhorse I know of for these kinds of problems are SMT solvers. (https://en.wikipedia.org/wiki/Satisfiability_modulo_theories)
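
As a toy illustration of that style (assuming the z3-solver Python bindings; the cost expression is made up), an optimizing SMT query could look like:

    # Toy optimizing SMT query with Z3: pick rotation offsets in 0..3 for
    # three lines while minimizing a made-up cost that penalizes adjacent
    # lines with different offsets.
    from z3 import Ints, Optimize, If, sat

    r1, r2, r3 = Ints("r1 r2 r3")
    opt = Optimize()
    for r in (r1, r2, r3):
        opt.add(0 <= r, r < 4)
    cost = If(r1 == r2, 0, 1) + If(r2 == r3, 0, 1)
    opt.minimize(cost)
    if opt.check() == sat:
        print(opt.model())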



