I think there are two distinct phenomena happening:
1) Rate of change in fundamental technology.
2) Feature churn.
I don't think #1 is actually all that fast compared to previous decades. Consider the period 1991-2001 and the period 2011-2021. Technology change in the former was much, much faster than in the latter. A typical PC went from {486, DOS, 1-4MB RAM} to {Pentium 4, WinXP, 256MB-1GB RAM}. Linux had only just launched in 1991. ~Nobody had a cellphone in 1991. ~Nobody was on the internet in 1991.
But look at 2011-2021, and is anything really that different? Computers are faster, but it's nothing like the growth rate of the 90s. iPhones, Bitcoin, GPUs, broadband, cloud, Minecraft ... we had all these in 2011. They're just incrementally better now.
Fundamental tech is still improving incrementally, but revolutions are few and far between.
#2, on the other hand, is in its golden age. And it's all for the wrong reasons, largely articulated by others on this thread. My addition: our ability to create new software has outpaced our ability to think of new ideas that are beneficial for users.
> our ability to create new software has outpaced our ability to think of new ideas that are beneficial for users.
Great insight. I used to put it snarkily as "developers gotta develop", but your description is clearer and I think gets at the root of the problem: companies have all these developers sitting in chairs, and they need them to do something. But the Good Idea faucet is not flowing fast enough to keep everyone busy, so instead of (1) having the developers fix bugs, improve quality, and tighten security, or (2) declaring the product finished and moving on to another one, companies are choosing (3): turning on the "Bad Idea" faucet and changing things for the sake of changing them. Almost every major "legacy" product 5+ years old is in this churn-for-the-sake-of-churning phase, and as a user it's awful.