> 20 years is equivalent to a couple of geological ages in tech
I disagree with this. I'm not going after you specifically, but I think this attitude is part of why the IT industry seems to reinvent the wheel every few years: there's this perception that we're GOING WHERE NOBODY HAS GONE BEFORE. No, most of us are not. Maybe a few people in research labs are, or people pushing at the raw edge of cryptography or mathematics, but the rest of us are basically cycling through the same ideas over and over, in different clothing.
I can't tell you how many times I've run into a problem, done some research, and found out that the optimal practical solution or algorithm was devised by some dude working at IBM in the 60s. (In fairness, some of those guys were really ahead of their time.) A person could make a very good living just strip-mining old ACM research papers from the 80s and selling the ideas in proof-of-concept form to the government, the military, investors, or anyone else with no sense of history.
Sometimes I wonder how much further we'd get if we did a better job building on prior efforts and resisting the urge to clean-slate things quite so often.
This article is from 2014, but hardly anything has changed since then:
https://www.comsol.com/blogs/havent-cpu-clock-speeds-increas...
https://superuser.com/questions/543702/why-are-newer-generat...
https://i.stack.imgur.com/z94Of.png