Hacker News
Progress in Algorithms Beats Moore’s Law (agtb.wordpress.com)
50 points by robg on Dec 26, 2010 | hide | past | favorite | 15 comments



Title should be: Progress in Some Algorithms Beats Moore's Law

which is not surprising


Perhaps Moore's law is a consequence of people's ability to design algorithms. Hardware design needs similar resources and concepts: circuit routing, simulations, benchmarking, design re-use, code bases ( http://en.wikipedia.org/wiki/Hardware_description_language )...


Some discussion earlier today:

http://news.ycombinator.com/item?id=2038726


This is a gratifying idea to think about. It implies that the "underoptimized software" woes are not really a problem of increased hardware causing laziness - they're a result of combined hardware and algorithm developments making problems _possible_ to solve that previously would have been too hard. That they could be solved with tighter code is ultimately less important.


Why is this surprising?

First of all, Moore's Law isn't a law at all, it's a prediction. It's not some universal constraint.

Second, it's a self-fulfilling prophecy, which explains why the prediction has been pretty darn good so far:

http://en.wikipedia.org/wiki/Moore%27s_law#As_a_target_for_i...


I'm basically going to abuse this article to make the claim Computer Science > Computer Engineering (by a factor of 43!).


43-factorial is still a constant, so O(0) as far as complexity is concerned.


You meant O(1) perhaps.


O(0) ?
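To spell out the nitpick (a sketch using the standard textbook definition of Big-O, nothing from the article itself):

```latex
% f \in O(g) iff there exist constants c > 0 and n_0 such that
%   |f(n)| \le c \cdot |g(n)|  for all  n \ge n_0.
%
% Take g(n) = 0. Then the condition forces |f(n)| \le 0 for all
% large n, i.e. f is eventually zero. So O(0) contains only
% functions that vanish past some point.
%
% A nonzero constant such as 43! satisfies
%   43! \le c \cdot 1  (pick c = 43!),
% so 43! \in O(1), but 43! \notin O(0).
```

Hence the correction above: any fixed constant factor collapses to O(1), while O(0) is the degenerate class of eventually-zero functions.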


Self-driving cars offer, perhaps, a counterexample:

It was a fundamental problem. In the mid-'90s, microchips weren't fast enough to process all the potential options, especially not at 55 miles per hour. In 1996, Dickmanns proclaimed that real-world autonomous driving could "only be realized with the increase in computer performance … With Moore's law still valid, this means a time period of more than one decade." He was right, and everyone knew it. Research funding dried up, programs shut down, and autonomous driving receded back to the future.

from:

http://www.wired.com/wired/archive/14.01/stanley.html?pg=2&#...


He was right, and everyone knew it. Research funding dried up, programs shut down, and autonomous driving receded back to the future.

This seems like a self-fulfilling prophecy in its own right: we'll never know how much closer we might be to self-driving cars now if funding had not dried up.


We have self-driving cars right now; see, for example, DARPA's Urban Challenge. We only need to increase reliability and decrease costs to make them suitable for the market. There are also some legal problems (e.g. who is responsible if the robot car kills a kitten), but technologically self-driving cars are a reality.


But which came first? The receding or the drying?


That is perhaps a bad example because of its reliance on other kinds of technology, like LIDAR. Innovation in sensors is a confounding variable in addition to algorithms and processor speeds.


Related: DARPA Urban Challenge http://www.darpa.mil/grandchallenge/index.asp



