
People still haven't completely worked through the fact that Moore's Law has died. I think the anger phase was the late 2000s, when quantum computing and the like were trotted out to denounce anyone who noticed the slowdown.

This looks like bargaining to me.




Nah, you're not taking into account Moore's Meta Law:

"Whenever Moore's law is no longer applicable, its definition is changed to accommodate some new version of 'computers get faster'."

Moore's Law was dead a long, long time ago.


Completely agree on both counts. There are seven stages of grief, so these things take time.


Classically there are five (denial, anger, bargaining, depression, acceptance).


That's the old version. Intel added two phases.


It seems to me that there is still some life left in Moore's Law. Maybe not directly by shrinking the manufacturing process, but:

AlphaGo requires a huge amount of energy and CPU power to accomplish what the human brain does on 20 W, using only a smallish portion of its capabilities. There must be plenty of undiscovered architectural improvements to computers that can still keep Moore's Law going for a couple more years.


Moore's law is about transistor density, and thus is oblivious to architecture.


And that's exactly what the parent commenter is saying. Moore's Law actually wasn't about transistor density in the beginning; it was changed to this definition when people realized transistor density was the only thing still increasing exponentially.


Well, Moore's law is stupid however it is defined, and I do also hope for big benefits from architecture (interface and implementation) work. So I guess I shouldn't be picking an argument :).

[As an aside, I had thought that Moore's law was originally transistor density, and the hype machine spun it into performance. But like I said, Moore's law is doomed regardless, so whatever.]


GPUs are keeping Moore alive



