The Sandy Bridge and Ivy Bridge stuff definitely stayed relevant for a very long time, but we've hit a point where even high-end chips from that generation aren't really wide enough for a lot of users; four cores isn't really enough for midrange consumer workloads. You can say, with some (but not a ton, IMO) justification, that "software is slower now", but software also does more stuff, and that's getting to the point where it demands more juggling than a quad-core chip is up for, especially one with roughly one-quarter the single-thread performance of a current midrange part (rough estimate, comparing Passmark single-thread scores for an i7-2600 and an i5-13600).
There's a glut of cheap Skylakes that can't run Windows 11, though, and those certainly have some value on the budget side.
> but we've hit a point where even high-end chips from that generation aren't really wide enough for a lot of users; four cores isn't really enough for midrange consumer workloads.
So "and software demanding more resources for no good fucking reason.".
You're really just reinforcing the parent's point.
That's silly. Even if you set aside that people want computers to do more things at once, the basic act of decoding 4K video is beyond a Sandy Bridge chip without a discrete GPU. The world has passed it by. Sorry that weirdo tech primordialism doesn't really work, but not that sorry.
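If anyone wants to sanity-check the decode claim on their own box, here's a rough sketch (assuming ffmpeg is installed; "sample_4k.mkv" is a placeholder name, not a real file): decode the clip as fast as the CPU allows, discard the output, and compare wall-clock time against the clip's duration. If decoding takes longer than playing, the chip can't keep up in software.

    # Rough real-time software-decode check.
    # Assumes ffmpeg is on PATH; "sample_4k.mkv" is a placeholder filename.
    import subprocess, time

    start = time.monotonic()
    subprocess.run(
        ["ffmpeg", "-v", "error", "-i", "sample_4k.mkv", "-f", "null", "-"],
        check=True,
    )
    elapsed = time.monotonic() - start
    print(f"decoded in {elapsed:.1f}s -- if that's longer than the clip's "
          "duration, this CPU can't software-decode it in real time")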
> but we've hit a point where even high-end chips from that generation aren't really wide enough for a lot of users; four cores isn't really enough for midrange consumer workloads.
I dispute this - I had an i5-2500K (which is a 4-core Sandy Bridge processor) that I upgraded about two years ago. I didn't upgrade for performance reasons (I was still gaming on it!), but rather because one of my RAM sticks had gotten crashy and I wanted to try building a mITX computer instead. It would definitely still be plenty for a lot of use cases. I still have the 2500K and motherboard sitting in a box, just in case someone desperately needs a computer.
> but we've hit a point where even high-end chips from that generation aren't really wide enough for a lot of users
Software-hardware coevolution in practice.
When the first dual cores appeared, they were dual hardware cores... in a software world that was designed to operate with a time-sliced single core.
It felt like magic!
However, now most pieces of heavy software can use multiple cores, so you need 'one more core' to get the same level of surplus capacity.
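As a toy illustration of that shift (my sketch, not anything from the thread): the same CPU-bound busywork run one job at a time, time-sliced style, and then fanned out across every core. On a quad-core the parallel run saturates the whole chip, which is exactly why the old "spare core" headroom is gone.

    # Toy demo: one CPU-bound task, run serially and then across all cores.
    import time
    from multiprocessing import Pool

    def burn(n: int) -> int:
        # Deliberately CPU-bound busywork.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        jobs = [5_000_000] * 8

        t0 = time.monotonic()
        for n in jobs:          # time-sliced, single-core style
            burn(n)
        t1 = time.monotonic()

        with Pool() as pool:    # one worker per core by default
            pool.map(burn, jobs)
        t2 = time.monotonic()

        print(f"serial:   {t1 - t0:.2f}s")
        print(f"parallel: {t2 - t1:.2f}s")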
Also, the typical load of multiple open tabs running modern JS web apps is much heavier than anything that would have been running constantly back then.
In the '90s and '00s, you closed your background programs if you needed to run something important.