Misunderstanding the Knuth premature optimization quote (bluebytesoftware.com)
34 points by praptak on Sept 8, 2010 | 8 comments



The actual line from Knuth's article "Structured Programming with go to Statements" is:

(...) "premature emphasis on efficiency is a big mistake which may well be the source of most programming complexity and grief."

Note: not "optimization" but "premature emphasis on efficiency", and "may well be the source of most programming complexity and grief", not "root of all evil."

Who invented "root of all evil" then?

Edit: here it is, also Knuth:

" There is no doubt that the grail of efficiency leads to abuse. Programmers waste enormous amounts of time thinking about or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance a considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3 %. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code; but only after that code has been identified."

OK, there it is, but nicely guarded in a long sentence and not meant to be used as some mantra!

Additionally, I agree with Joe Duffy:

"It's an all-too-common occurrence. I'll give code review feedback, asking "Why didn't you take approach B? It seems to be just as clear, and yet obviously has superior performance." Again, this is in a circumstance where I believe the difference matters, given the order of magnitude that matters for the code in question. And I'll get a response, "Premature optimization is the root of all evil." At which point I want to smack this certain someone upside the head (...)"


Yeah, personally I think it's simply a question of experience. For example, lispers prefer the cons+nreverse solution to anything else because it tends to be the fastest way (yes, I've tried to beat it with a "smarter" algorithm and failed too) while still being clear.

I think the idea of not optimizing for efficiency is a good one, but (like all things) it can be taken to extremes. If you consistently choose the slowest way to do something, then when you are ready to optimize for performance you'll probably have problems. What do you do if, after the bottlenecks have been removed, the system is still too slow?


Lol. Service Unavailable... Knuth really was wrong. This server should have been prematurely optimized for Hacker News :D


If you're having a problem with performance, then by definition fixing it isn't premature.


I think he meant that it should have been prematurely optimized so that it could have handled the traffic from HN and there would never have been any problems with performance.


Unfortunately, many programmers take the opposite tack: they shove optimizations in a priori, before any testing or data, just because "they know it's faster" or "you should never do X". The result is often convoluted code. The Pareto Principle + Knuth's axiom (as shown by acqq) is the best route through such minefields, IMO.

You may not be one of the "many programmers" but I certainly have worked with enough of them over the years to know just how pervasive they tend to be.


It is a typical case of an ideological approach (i.e., following principles) clashing with analytical reasoning over real facts/measurements/observations. People everywhere, be it in science, programming, politics, religion, ... like to hide in the safe haven of dogmas. Whenever expressing doubt is prohibited or frowned upon, you immediately know it is a case of ideology/dogma.


I've never seen the whole quote; that does make a lot more sense than the fragment we usually see.



