
To quote my CS professor: “You will notice that most of the algorithms we will cover, and most of those in the white book, are so short and simple that they fit on a single page, and usually less. Real-world code is longer, but there is a world out there of algorithms that probably aren’t this short and simple; we just don’t know them yet.” (1995 or thereabouts)

If anything, I think he missed a key point: algorithmic improvement is really only interesting in a few very specific places.

I think pdqsort actually shows this to be true - it uses added complexity to improve on things that really do fit in a few lines of pseudocode.
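For context, here's how small the core really is: a minimal Lomuto-partition quicksort in Python, with comments marking roughly where pdqsort layers on its practical refinements. The comments paraphrase pdqsort's documented tricks; the code itself is just the textbook algorithm, not pdqsort's actual implementation.

    def quicksort(xs, lo=0, hi=None):
        # Textbook Lomuto-partition quicksort: the whole algorithm in a
        # dozen lines. pdqsort keeps this skeleton but adds, roughly:
        #   - insertion sort for small partitions,
        #   - smarter pivot selection (median-of-3 style),
        #   - detection of already-sorted and equal-element runs,
        #   - a heapsort fallback when partitioning degrades (introsort).
        if hi is None:
            hi = len(xs) - 1
        if lo >= hi:
            return
        pivot = xs[hi]                    # naive last-element pivot
        i = lo
        for j in range(lo, hi):
            if xs[j] < pivot:
                xs[i], xs[j] = xs[j], xs[i]
                i += 1
        xs[i], xs[hi] = xs[hi], xs[i]     # move pivot into final place
        quicksort(xs, lo, i - 1)
        quicksort(xs, i + 1, hi)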




There is definitely a trade-off between brevity and efficiency/practicality. A classic example is "The Fastest and Shortest Algorithm for All Well-Defined Problems":

> An algorithm M is described that solves any well-defined problem p as quickly as the fastest algorithm computing a solution to p, save for a factor of 5 and low-order additive terms

Spoiler alert: the "algorithm M" is a brute-force search. The "low-order additive terms" in its time complexity come from brute-force searching for provably faster programs; whenever one is found, we restart with it. All the time spent searching for proofs is independent of the input size (it depends on the problem, e.g. "sorting", but not on which particular list we've been given), so it's technically just a constant "warmup time"!

http://hutter1.net/ai/pfastprg.htm
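
To make that concrete, here's a very loose Python sketch of the simplified reading above. The proof search is stubbed out, and every name (solve, provably_equivalent_and_faster, etc.) is an illustrative placeholder, not the paper's notation; the real M interleaves the proof search with running the fastest program proven correct so far, which is roughly where the factor of 5 comes from.

    def solve(problem_input, baseline, candidates):
        # Toy sketch, NOT the paper's construction. `baseline` is any
        # known-correct program for the problem; `candidates` is a
        # finite enumeration of programs to vet.
        best = baseline
        for candidate in candidates:
            # This loop inspects programs, never `problem_input`, so
            # its cost is constant in the input size: the "warmup".
            if provably_equivalent_and_faster(candidate, best):
                best = candidate  # "restart" with the faster program
        return best(problem_input)

    def provably_equivalent_and_faster(p, q):
        # Stub for the proof search: enumerate formal proofs that p
        # computes the same function as q with a better time bound.
        # Undecidable to settle in general, but any proof that exists
        # is eventually found by exhaustive enumeration.
        return False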



