
The viewpoint the author expounds reeks of second-system syndrome. That is, he seems to argue that incremental evolution is broken, since the state at any given point is suboptimal compared to what someone building from scratch, with all the lessons of hindsight, would build.

Of course it is. Yet computer science history is full of overly ambitious projects (Plan9, Vista, Lisp) falling by the wayside while supposedly inferior but more incremental solutions continued to chug along. The author doesn't stop to consider why those projects failed; he only blames a myopic community, as if Lisp's superiority would have proven itself if only more people had given it a chance.

Incremental change is how we progress. We take the lessons we have learned from what we're doing now, and then try something a little bit different. We can't redesign everything at once, so we pick a few things to change, and other compromises get left in. Compromise is an essential requirement of getting anything big done.




he seems to argue that incremental evolution is broken

I think he argues that this sort of absolutism is broken.

The author isn't proposing that we discard incremental change, just the uncritical assumption that incremental change is the only reasonable kind.

See his analogy with portfolio theory: he's not even challenging incrementalism as the default, any more than he'd suggest putting 90% of your money into emerging markets.


It seems more like a historical observation than an uncritical assumption that incremental change has shown itself to be more successful. The author's claim seems to be that incremental change has been more successful historically because it has more mindshare. I think that causality runs the other way.



