As usual, I think CPUs got "fast enough" for the majority of applications long ago, and it's only software inefficiency that has driven the need for more powerful hardware.
I would say the "long ago" was 13 years ago with the release of the legendary Sandy Bridge line. Those things are still fast enough today that you don't need anything newer for most practical daily tasks.
Sandy Bridge's staying power even became a meme in Japan, with Sandy Bridge users refusing to upgrade due to lack of need becoming known as "Uncle Sandy" (サンディーおじさん, Sandy-ojisan).
At this point there are only two reasons to upgrade: software legitimately pushing the performance envelope, like vidja gaemz, and software demanding more resources for no good fucking reason.
> Those things are still fast enough today that you don't need anything newer for most practical daily tasks.
A 13-year-old CPU should definitely struggle to keep up with just the basic Electron-based chat applications we use today, and that's only one of the pieces of software most people need to use. There is still no incentive for startups/VC-funded companies to make their software more performant.
My mom has an i5-2500 ThinkCentre, and for her needs that CPU is good enough; she mostly browses the web and uses basic apps like Office or Spotify.
I'd say the biggest bottleneck is the old iGPU, but I solved that with a cheap GT 1030. The performance is a lot better than a current low-end Intel (with the main drawback being the power draw).
I used a 2700K for a long time - it was a great chip with great longevity. But there are a lot of basic apps today it would struggle with - even Firefox with a few extensions would make it sweat. The platform RAM speed was a lot slower too.
I don't think I'd want to run a 2700k today unless there were no other computers available.
I've since upgraded to a 14700K and Windows 11, but I still have my previous 2700K machine around as a sub unit.
Aside from forced Windows 7 obsolescence, it can do practically everything just fine except play the very latest games. If I could have been bothered to downgrade the Windows 7 install to Windows 10, it would be even more useful today.
From the 2700K I went to the 7600K, and now to the AMD 7900X. I sold the 2700K a long time ago, but I still have the 7600K machine somewhere. I should get rid of it.
The Sandy Bridge and Ivy Bridge stuff definitely stayed relevant for a very long time, but we've hit a point where even high-end chips from that generation aren't really wide enough for a lot of users; four cores isn't really enough for midrange consumer workloads. You can with some (but not a ton, IMO) justification say that "software is slower now", but software also does more stuff, and that's getting to the point where you need to juggle more than a quad-core chip is up for, especially one with roughly one-quarter the single-thread performance of a current part (rough estimate, comparing Passmark single-thread scores for an i7-2600 and an i5-13600).
There is, however, a glut of cheap Skylakes that can't run Windows 11, and those certainly have some value on the budget side.
> but we've hit a point where even high-end chips from that generation aren't really wide enough for a lot of users; four cores isn't really enough for midrange consumer workloads.
So "and software demanding more resources for no good fucking reason.".
You're really just reinforcing the parent's point.
That's silly. Even if you set aside that people want computers to do more things at once, the basic act of decoding 4K video is beyond a Sandy Bridge chip without a discrete GPU. The world has passed it by. Sorry that weirdo tech primordialism doesn't really work, but not that sorry.
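For what it's worth, that claim is easy to sanity-check yourself: time a pure software decode of a 4K clip and compare it against the clip's duration. Here's a minimal sketch, assuming ffmpeg/ffprobe are installed; "clip_4k.mp4" is just a placeholder file name:

```python
#!/usr/bin/env python3
# Rough sketch: time a pure software decode of a 4K clip with ffmpeg.
# If the decode takes longer than the clip's duration, the CPU can't
# keep up in real time without hardware acceleration.
# Assumes ffmpeg/ffprobe are on PATH; "clip_4k.mp4" is a placeholder name.
import subprocess
import time

CLIP = "clip_4k.mp4"

def clip_duration(path):
    # ffprobe prints the container duration in seconds.
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_entries", "format=duration",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return float(out.strip())

def software_decode_seconds(path):
    # Decode only: frames are discarded by the null muxer, nothing is
    # encoded or displayed, so this measures CPU decode throughput.
    start = time.monotonic()
    subprocess.run(
        ["ffmpeg", "-hide_banner", "-loglevel", "error",
         "-i", path, "-f", "null", "-"],
        check=True,
    )
    return time.monotonic() - start

if __name__ == "__main__":
    duration = clip_duration(CLIP)
    elapsed = software_decode_seconds(CLIP)
    print(f"clip: {duration:.1f}s, decode: {elapsed:.1f}s "
          f"({duration / elapsed:.2f}x real time)")
```

If the ratio comes out below 1x real time, the chip simply can't play that clip smoothly without a hardware decoder.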
> but we've hit a point where even high-end chips from that generation aren't really wide enough for a lot of users; four cores isn't really enough for midrange consumer workloads.
I dispute this - I had an i5-2500K (which is a 4-core Sandy Bridge processor) that I upgraded about two years ago. But I didn't upgrade for performance reasons (I was still gaming on it!); rather, it was because one of my RAM sticks had gotten crashy and I wanted to try building a mITX computer instead. It would definitely still be plenty for a lot of use cases. I still have the 2500K and motherboard sitting in a box, just in case someone desperately needs a computer.
> but we've hit a point where even high-end chips from that generation aren't really wide enough for a lot of users
Software-hardware coevolution in practice.
When the first dual cores appeared, they were dual hardware cores... in a software world that was designed to operate around a time-sliced single core.
It felt like magic!
However, now most pieces of heavy software can use multiple cores, so you need 'one more core' to get the same level of surplus capacity.
Also, the typical load of multiple tabs open with modern JS web apps is much heavier than anything that would have been running constantly then.
In the '90s and '00s, you closed down your background programs if you needed to run something important.
You are right. Goldmont Plus isn't really interesting as a product. Even its power efficiency wasn't that good at the time; you were better off with a two-core ULV Skylake part.
And so you only really found it in really cheap use cases, and almost everyone ignored it.
But in the context of Gracemont and Crestmont (which are very interesting products, and a massive departure from Intel's traditional uArch), Goldmont Plus's uArch becomes very interesting, because it's a forgotten stepping stone along the way.
For example, Goldmont Plus has a massive predecode cache, which kind of makes sense given Intel's plans to add twin decoder clusters to Tremont (and later triple decode clusters to Skymont). Except Tremont deleted it, so I guess the idea didn't work out... again, because this is far from the first time someone has experimented with predecoding x86.
> a 2c/4t Skylake ULV chip can offer much better single threaded performance
Those Amber Lake Y chips were the bee's knees. 5W Core CPUs -- these were the last. Ice Lake and Tiger Lake were at least 7W, and then the low-power Alder Lake CPUs only contain E-cores, so they are just renamed Atom CPUs, which is visible from the N-series numbering, too.
I've got a few of these chips (or something very similar, the J4105) in some repurposed Dell Wyse 5070s. They're actually really impressive chips (to me), in that they sip power and do a lot, including web browsing, running a lightweight IDE, and basic office suite stuff on Debian, and they're really overpowered as an appliance (they're popular to convert into routers). Honestly, I'm surprised they didn't sell a lot more of these.
The Intel spec for these chips says only 8GB of RAM, but they're dual-channel and I find they'll actually run 32GB just fine. The newer N-series processors everyone seems to like are actually kind of inferior in that regard, since they're single-channel and apparently really only do 16GB of RAM.
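If you want to see what a box is actually populated with, here's a rough sketch of listing the DIMM slots on Linux; it assumes dmidecode is installed and run as root, and the text parsing is approximate:

```python
#!/usr/bin/env python3
# Rough sketch: list populated memory slots via dmidecode so you can see
# whether both channels/slots are in use. Assumes Linux, dmidecode
# installed, and root privileges; the parsing is approximate.
import subprocess

def populated_dimms():
    out = subprocess.run(
        ["dmidecode", "-t", "memory"],
        capture_output=True, text=True, check=True,
    ).stdout
    dimms, size = [], None
    for raw in out.splitlines():
        line = raw.strip()
        if line.startswith("Memory Device"):
            size = None                      # start of a new DIMM record
        elif line.startswith("Size:"):
            size = line.split(":", 1)[1].strip()
        elif line.startswith("Locator:"):
            slot = line.split(":", 1)[1].strip()
            if size and size != "No Module Installed":
                dimms.append((slot, size))
    return dimms

if __name__ == "__main__":
    for slot, size in populated_dimms():
        print(f"{slot}: {size}")
```

Two populated slots on different channels is what you want to actually get the dual-channel bandwidth.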
I have a Lenovo IdeaPad with the Pentium Silver 6000 chip ($150 on sale at Costco) and it does everything I need with 4GB of RAM running Devuan; someday I will upgrade it to 12GB by installing 8GB in it.
The one discussed in the article has similar performance with 1/10th the power consumption: https://www.cpubenchmark.net/cpu.php?cpu=Intel+Celeron+J4125...