I would argue against QC on theoretical grounds (i.e. will the asymptote of error correction be enough to beat the asymptote of noise?), not technological ones; recall that not long ago a single transistor was a whole experiment on its own, and today there are single chips with a trillion (!) of them.
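For what it's worth, the usual way that asymptote question is framed is the threshold theorem. A rough rule-of-thumb form for a distance-$d$ code (the exponent and numbers below are the commonly quoted surface-code approximation, not exact figures):

\epsilon_L \approx A \left( \frac{p}{p_{\mathrm{th}}} \right)^{\lfloor (d+1)/2 \rfloor}

where $p$ is the physical error rate per operation, $p_{\mathrm{th}}$ is the code's threshold (on the order of 1% for the surface code), and $\epsilon_L$ is the logical error rate. If hardware stays below threshold, $\epsilon_L$ falls exponentially with $d$, but the physical qubit count per logical qubit grows roughly as $d^2$, so the whole argument turns on whether $p < p_{\mathrm{th}}$ continues to hold as machines scale up.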
On the other hand, the gap between computing first being used as a top-secret weapon and it becoming a commercially available product was around 10-15 years: in 1936 Turing published his paper on computability; in 1938 the Polish mathematicians attacking Enigma built their "bomba" machines to help with decryption; by 1948 the Manchester Baby, the first stored-program electronic computer, was running; and by 1951 the UNIVAC I was being sold to corporations. Quantum computing has not advanced anywhere near this fast (because it is a much, much harder problem). The past pace of technological advance can't be used to predict the future pace in a quite different field.
Right, think of fission's fast progress versus fusion's very slow progress. Fission and fusion are very much about the same thing, so they should be equally easy, right? But they're not.
There were simple computational devices going back centuries and, depending on the definition, even millennia. The formal theory of computation was also worked out and proven before electronic computers were born.
Also, for many of the things we would use a QC for, we already have digital computers, and they pick most of the low-hanging fruit... this was not the case when digital computers started out.
The original computers were big because they were mechanical or electromechanical, and they shrank by going fully electronic (vacuum tubes, then transistors) and through more precise engineering.
QCs are big because the qubits have to be kept extremely cold and isolated from noise, which requires complex equipment to deliver control signals and remove heat, akin to the tyranny of the rocket equation.