Everything has a failure rate. This introduces another set of heuristics (and another false-{positive,negative} rate), and another set of potential implementation bugs in those heuristics. What if being able to run the program on a particularly memory-limited computer could save the universe, even more slowly than the project manager originally wanted it to run?
Statically encoding something like a memory constraint sounds very hard for all but the simplest algorithms. What about a randomized algorithm that sometimes uses more heap? How do you get around not knowing how much it's going to use without running it?
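As a concrete illustration (a hypothetical sketch, not something from the discussion above): skip-list node heights are geometrically distributed, so the memory a node consumes is only bounded in expectation, not in the worst case. Any static bound has to be probabilistic or else assume a pathological maximum.

```python
import random

def skiplist_node_height(p=0.5, rng=None):
    """Height of a skip-list node: keep flipping a biased coin.
    The result is geometrically distributed, so per-node memory
    has no hard worst-case bound, only an expected value."""
    rng = rng or random.Random()
    h = 1
    while rng.random() < p:
        h += 1
    return h

# Peak memory over many inserts varies from run to run; a static
# analyser can certify the expectation, not a fixed ceiling.
heights = [skiplist_node_height(rng=random.Random(i)) for i in range(10_000)]
avg = sum(heights) / len(heights)
# Expected height is 1/(1-p) = 2 for p = 0.5, but individual nodes
# can be much taller, and occasionally are.
```

Running this, `avg` hovers near 2 while `max(heights)` is several times larger, which is exactly the gap between "how much heap will it use in expectation" and "how much heap might it use on this run".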