Except you can bound it, and you can escape it that way.
IE "not gotten anywhere after 30 iterations, so don't know".
Otherwise, you couldn't, for example, ever compute the tripcount for a loop without risking running forever.
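To make that concrete, here's a minimal sketch of the idea (Python, purely illustrative): simulate a loop's induction variable for a capped number of steps, return the trip count if the loop exits within the budget, and answer "don't know" otherwise. The loop model (start value, step function, exit condition) and the cap of 30 are assumptions for the example, not any real compiler's IR.

    MAX_ITERS = 30

    def bounded_trip_count(start, step, cond):
        """Return the loop's trip count if it exits within MAX_ITERS, else None ("don't know")."""
        value = start
        for trips in range(MAX_ITERS):
            if not cond(value):        # exit condition reached: we know the trip count
                return trips
            value = step(value)        # advance the induction variable one iteration
        return None                    # budget exhausted: give up rather than risk running forever

    # for (i = 0; i < 10; i += 3) runs 4 times
    print(bounded_trip_count(0, lambda i: i + 3, lambda i: i < 10))  # -> 4
    # a loop whose exit we can't reach within the budget: "don't know"
    print(bounded_trip_count(0, lambda i: i, lambda i: True))        # -> None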
" Compilers happily cheat on optimizations, register layout, etc."
They don't cheat. They just make the problem easier at a cost of optimality.
Or, they discover they didn't need to solve the problem they thought they did.
A great example is register allocation.
It's classically done by graph coloring, which is NP-complete.
It's NP-complete to decide whether a given graph admits a k-coloring for a given k, except for k ∈ {0, 1, 2}.
It's NP-hard to compute the chromatic number (i.e., just figuring out the minimum number of colors a graph needs).
This was the state, right up until a number of people proved that the interference graphs of programs in SSA form (a linear-time transformation) are chordal, so you can optimally color them in linear time.
A small difference.
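For a feel of the algorithm behind that: a chordal graph can be colored optimally by greedily handing out the lowest free color along a perfect elimination order. A rough sketch in Python, with a toy hand-made graph and ordering (not compiler output):

    def greedy_color(graph, elimination_order):
        """Greedy coloring; optimal when elimination_order is a perfect
        elimination order of a chordal graph."""
        color = {}
        for v in reversed(elimination_order):          # color in reverse elimination order
            used = {color[n] for n in graph[v] if n in color}
            c = 0
            while c in used:                           # smallest color not used by a neighbor
                c += 1
            color[v] = c
        return color

    # toy chordal "interference graph": triangle a-b-c, with d interfering only with c
    graph = {
        'a': {'b', 'c'},
        'b': {'a', 'c'},
        'c': {'a', 'b', 'd'},
        'd': {'c'},
    }
    print(greedy_color(graph, ['d', 'a', 'b', 'c']))   # 3 colors, optimal (a-b-c is a clique)

The whole pass is linear in the size of the graph, which is the "small difference".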
So, often, these upper bounds that seem "futile" are not as futile as one might think.