Throughout my entire career, we have never optimized code as well as we could; we optimize as well as we need to. The obvious result is that computer performance is only "just okay" despite the hardware being capable of much more. This pattern has repeated itself across the industry for decades without changing much.
The problem is that performance for the most common tasks people do (e.g. browsing the web, opening a word processor, hell, even opening an IM app) has gone from "just okay" to "bad" over the past couple of decades, despite our computers getting many times more powerful along every possible dimension (instructions-per-clock, clock rate, cache size, memory speed, memory size, ...).
For all this decreased performance, what new features do we have to show for it? Oh great, I can search my Start menu, and my taskbar has had a shiny gradient for a decade.
I think a lot of this is actually somewhat misremembering how slow computers used to be. We used to use spinning hard disks, and we were so often waiting for them to open programs.
Thinking about it some more, the iPhone and iPad actually come to mind as devices that perform well and are practically always snappy.
> I think a lot of this is actually somewhat misremembering how slow computers used to be
Suffice it to say: I wish. I have a decently powerful computer now, but that only happened a few years ago.
> We used to use spinning hard disks, and we were so often waiting for them to open programs.
Indeed, SSDs are much faster than HDDs. That is part (but not all) of how computers have gotten faster. And yet we still wait just as long or longer for common applications to start up.
> the iPhone and iPad actually come to mind as devices that perform well and are practically always snappy
Terribly written programs are perfectly common on iP* and can certainly be slow. But you're right, having a high-end device does make the bloat much less noticeable.