Every once in a while I see an article that seems to claim that Moore's Law is over, or slowing down, or about to be over. Then I see some counter-claim, that no, if you account for added cores, or GPUs, or some other third thing, that actually it's still right on track. This cycle has repeated every year for like the past 10 years, but the last few years feel like things have really started to slow down. Maybe that was partially illusory with the chip slowdown from the pandemic, but I figure now that we're several years out we should be able to say for sure.
It also seems like a pretty important question to answer because it has big implications for the advancement of AI technology which has everyone so freaked out.
So what's the consensus around here? Is Moore's Law actually over yet, or not?
To me, this graph shows that while Moore's law, in the sense of transistor count doubling per "thing Intel or AMD sells you", technically still holds, it has ended for single-threaded workloads. Moore's law is only holding up because of core count increases.
For everyday users running multiple simple programs/apps, that's fine. But for truly compute-heavy workloads (think CAD software or any other heavy processing), developers turned to GPUs to keep getting compute power improvements.
Writing programs that take full advantage of the core count increase is simply impossible for most workloads: Amdahl's law says the serial fraction of a program caps the total speedup, no matter how many cores you add. So even if one wanted to rearchitect programs to take full advantage of the overall transistor count growth from ~2005 to now, they wouldn't be able to.
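To make the Amdahl's law point concrete, here's a quick back-of-the-envelope calculation (the 95%-parallel figure is just an illustrative assumption, not from any particular benchmark):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: speedup = 1 / ((1 - p) + p / n).

    The serial fraction (1 - p) caps the total speedup regardless
    of how many cores you throw at the problem.
    """
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even a program that is 95% parallelizable tops out at 20x:
for n in (2, 8, 64, 1_000_000):
    print(n, round(amdahl_speedup(0.95, n), 1))
# 2 cores -> 1.9x, 8 -> 5.9x, 64 -> 15.4x, a million -> barely 20x
```

So core counts can double forever and a mostly-parallel program still hits a hard ceiling, which is exactly why the post-2005 scaling feels so different.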
Compare with pre-2005, when one just had to sit and wait to see their CPU-heavy workloads improve... It's definitely a different era of compute improvements.