I recently saw a slide showing that although compute per individual CPU/GPU is reaching an upper bound, compute per dollar is still growing exponentially.
That gave me some hope that we'll still be able to do cool things in the coming decades even without some giant leap like quantum computers (having the compute in the cloud isn't as cool as having it on your local machine, but it's something).
Quantum computers are not faster classical computers. They solve some specific problems exponentially faster than classical computers. For example, we can break RSA keys with a quantum computer because we have a quantum algorithm (Shor's algorithm) for the underlying problem, factoring large integers.
We do not have many quantum algorithms that solve interesting problems exponentially faster. Finding such algorithms is not an easy task, and we do not expect all problems to be solvable this way.
Disclaimer: I'm a researcher in quantum algorithms.
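To make the RSA example concrete, here's a minimal Python sketch (function names are mine) of the classical skeleton of Shor's algorithm. The period-finding step is done by brute force here, which is the exponentially expensive part; a quantum computer replaces exactly that step with an efficient quantum subroutine, and everything else stays classical.

```python
from math import gcd
from random import randrange

def find_period_classically(a, n):
    """Find the order r of a modulo n, i.e. the smallest r with a^r = 1 (mod n).
    This brute-force loop is the step Shor's algorithm speeds up exponentially."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n):
    """Factor composite n via the order-finding reduction from Shor's algorithm."""
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d  # lucky guess: a already shares a factor with n
        r = find_period_classically(a, n)  # the quantum step in the real algorithm
        if r % 2 == 0:
            y = pow(a, r // 2, n)
            if y != n - 1:  # skip the trivial case a^(r/2) = -1 (mod n)
                f = gcd(y - 1, n)
                if 1 < f < n:
                    return f

print(shor_factor(15))  # prints 3 or 5
```

RSA's security rests on that factoring step taking superpolynomial time classically; with the quantum subroutine swapped in, the whole thing runs in polynomial time.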
That threat has been looming over me ever since I had my Commodore 64 in the 1980s. I'm sure it will hit some theoretical limit at some point in the future, but I wouldn't go as far as to predict next year for that.
Of course, we have been approaching the nanometre scale for a while now: my 2012 Mac could only manage about 70% of the build speed of my 2017 model, which in turn is about the same as the cheap Linux laptop I picked up a few months ago. GPUs are one area where things are still improving rapidly, because you can increase performance simply by adding more cores.
My cheap laptop is impressive in that it does what it does without thermal throttling or even heating up much. Not bad for a €700 i5 laptop. My 2017 MacBook Pro struggled to keep things cool.
The next leap is going to be a much larger number of CPU cores. We've been stuck at roughly 4-16 for the last decade, and there's no reason for this other than legacy compilers, languages, and CPU architectures: leveraging concurrency is just hard. GPUs kept increasing their core counts and have been doubling FPS for the same job much more reliably.
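A toy illustration of why GPU-style parallelism scales so easily: an embarrassingly parallel workload (independent tasks, no shared state) spreads across however many cores you have with almost no extra code, whereas general concurrent code needs locks and careful design. A hedged Python sketch, with all names my own:

```python
# Sketch: an embarrassingly parallel workload scales with core count
# almost for free; this is the kind of parallelism GPUs exploit.
from multiprocessing import Pool, cpu_count

def work(n):
    # Stand-in for an independent, CPU-bound task with no shared state.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [200_000] * 64
    with Pool(cpu_count()) as pool:   # uses every core available
        results = pool.map(work, jobs)
    print(len(results), "tasks done on", cpu_count(), "cores")
```

The moment tasks have to share mutable state, this simple map stops working and the hard part of concurrency begins.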
All of course amazingly quick compared to my trusty old C64.