I studied experimental condensed-matter physics and this article is spot on. The PIs at my university were holding clouds of rubidium atoms with laser tweezers at microkelvins in ultra-high vacuum, agitating them with a laser beam, and publishing papers saying "look it responds sort of like a qubit!"
If you have seen the kind of equipment required to perform these experiments, it's absolutely unimaginable that these concepts could be miniaturized enough that someone would be able to put them in a desktop size box, and to do so usefully and safely within a timeline that is competitive with the advancement of microelectronics.
I think the most realistic goal was always a mainframe-type computer that could perform computations for special purposes - not necessarily shrinking it down to phone size or even desktop size. But maybe the marketing really did have this objective.
They do not offer anything of use. This is just a demo for an API that in principle can be used as an "assembly language" for quantum computers, but it either runs on simulators that cannot handle more than ~10 "perfect" qubits, or on hardware that has 100-ish "noisy" qubits. You need about a million "noisy" qubits, together with error correction codes, to encode 1000 logical "perfect" qubits. That is about the minimum at which you can do something useful.
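As a back-of-envelope check on that million-to-thousand ratio (the 2·d² cost per logical qubit and the code distance below are rough surface-code-style assumptions on my part, not figures from any specific machine):

```python
# Sketch of error-correction overhead, assuming a surface-code-like scheme
# where one logical qubit costs roughly 2 * d**2 physical qubits at code
# distance d. Both the formula and d = 22 are illustrative assumptions.
d = 22                                   # plausible distance for useful error rates
physical_per_logical = 2 * d * d         # ~968 physical qubits per logical qubit
logical_needed = 1000
total_physical = physical_per_logical * logical_needed
print(physical_per_logical, total_physical)  # roughly 1000 and 1 million
```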
Cost of brute-forcing certain things (like NP-hard^W^W NP-complete problems) on regular computers quickly exceeds the cost of building and operating a quantum computer. Hence all the trying to make these contraptions work on non-toy problems.
We don't know of a single NP-hard problem where quantum computers would show any exponential speed-up (there may be some speed-up from using Grover's O(sqrt(n)) brute-force search algorithm instead of the classical O(n) search, but that will quickly be drowned out by the exponential factors).
Even worse, there is no reason to think there ever will be - as far as we know, QCs only show an exponential advantage on problems with very specific structure, while the whole difficulty of NP-hard problems is that, in general, they have no structure to exploit.
I thought there were concerns about public-key cryptography - factoring large numbers much faster with a QC at least sounds plausible, but maybe it's just speculation?
That is true, but integer factorization is not an NP-hard problem, as far as we know at least. A quantum algorithm, Shor's algorithm, is indeed known to solve it in polynomial time if you have a QC.
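For anyone curious how Shor's algorithm works: only the period-finding step needs a quantum computer; the surrounding number theory is classical. A minimal sketch, brute-forcing the period classically (which is exactly the step a QC would do in polynomial time via the quantum Fourier transform):

```python
from math import gcd

def find_period(a, N):
    """Classically brute-force the order r of a mod N, i.e. the smallest
    r with a**r == 1 (mod N). This is the step Shor's algorithm does
    quantumly in polynomial time."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_part(N, a):
    """The classical reduction in Shor's algorithm: given the period r of
    a mod N, recover nontrivial factors of N from gcd(a**(r/2) +/- 1, N)."""
    assert gcd(a, N) == 1
    r = find_period(a, N)
    if r % 2 == 1:
        return None          # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None          # trivial square root of 1: retry with another a
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_classical_part(15, 7))  # → (3, 5)
```

With N = 15 and a = 7, the period is 4, and gcd(7² ± 1, 15) yields the factors 3 and 5.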
It is known to be in NP, but suspected not to be NP-complete, never mind NP-hard. It is also suspected not to be in P, but that is not yet proven.
We're moving on to quantum-resistant public key cryptography, so unless some QC hardware breakthrough appears really soon, we'd expect quantum computers to be useless against public key cryptography because we won't rely on the difficulty of factoring large numbers as a security measure by the time large enough quantum computers appear.
From what I've read on Scott Aaronson's blog, NP-complete is also wrong. The general suspicion seems to be that the class of problems with polynomial-time quantum algorithms (BQP) is separate from NP - suspected to be neither a subset nor a superset of it.
Some problems in BQP are also in NP (integer factorization, for which the best known classical algorithm is sub-exponential, but a polynomial-time quantum algorithm exists), but there is no NP-complete problem for which a polynomial-time quantum algorithm is known, or even suspected to exist.
> If we interpret the space of 2^n possible assignments to a Boolean formula φ as a "database," and the satisfying assignments of φ as "marked items," then Bennett et al.'s result says that any quantum algorithm needs at least ~2^(n/2) steps to find a satisfying assignment of φ with high probability, unless the algorithm exploits the structure of φ in a nontrivial way. In other words, there is no "brute-force" quantum algorithm to solve NP-complete problems in polynomial time, just as there is no brute-force classical algorithm.
That is a bit too strong a claim (even taking "NP" to mean "NP - P", since all P problems are also in NP).
First of all, while not proven, it is considered most likely that integer factorization is not in P, so we potentially already know of one NP−P problem that can get an exponential speed-up from a QC (via Shor's algorithm).
Secondly, there is one non-exponential speedup that can apply even to NP-complete problems - using Grover's algorithm to find an element in an unordered list in O(sqrt(n)) instead of the classical O(n).
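Grover's quadratic speedup is easy to see in a toy state-vector simulation (the list size and marked index below are arbitrary choices for illustration, not anything from a real device):

```python
import numpy as np

# Toy simulation of Grover search over N = 16 items (4 qubits), looking
# for one marked index. Illustrates the O(sqrt(N)) query count.
N, marked = 16, 11
state = np.full(N, 1 / np.sqrt(N))       # uniform superposition

iterations = int(round(np.pi / 4 * np.sqrt(N)))  # optimal count, ~3 for N = 16
for _ in range(iterations):
    state[marked] *= -1                  # oracle: flip the marked amplitude
    state = 2 * state.mean() - state     # diffusion: reflect about the mean

print(iterations)                        # 3 quantum iterations vs ~8 classical probes
print(state[marked] ** 2)                # probability of measuring the marked item
```

After 3 iterations the marked item is measured with probability ≈ 0.96, versus an expected N/2 = 8 probes for classical linear search.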
Isn't the whole problem right there, in the n? This assumes the number of qubits can magically grow without a fuss - what if, to entangle more qubits, you need to cool the whole system exponentially harder, or spend exponentially more energy? All those O(...) notations flip over, don't they?
Assuming the current version of Quantum Mechanics is correct, no. Given that we know QM is not compatible with General Relativity, which means one or both of them must be wrong, we could potentially discover some limitation of QM while investigating this.
Ironically, if that were to happen, it would probably be a much more important boon for humanity than if we successfully build a working QC.
There isn’t actually any reason to believe one of them “is wrong” (we know they’re both approximations.) We just think that the macro and micro should follow the same fundamental laws because that’s how things have been thought to work so far.
It could very well be that they are both great approximations and it’s actually the underlying information structure that shifts depending on scale. This doesn’t seem likely to us perhaps, but only because of existing intuition which we know is likely wrong at some level.
Sure, "wrong" is a bit strong. I mean "wrong" in the same sense as Newtonian mechanics is wrong, which is exactly that it is an excellent approximation in some domains, but fails in others.
> It could very well be that they are both great approximations and it’s actually the underlying information structure that shifts depending on scale.
Right now, both QM and GR claim that they apply at any scale. If it turns out that the laws of physics change with scale, that means that both QM and GR are wrong, even though they may each be perfectly correct at the scale they have been seen to work so far.
>both QM and GR claim that they apply at any scale
The thing is, I don't believe either theory actually makes such a claim. Certain people have said that, and the untrained masses may assume it's the case, but I don't think the scholarly proponents or intellectual founders of either system made such a claim (in fact Newton was religious, and Einstein believed we were way off by his death).
>that means that both QM and GR are wrong
How though? They are both right for their use case so are likely subsets of a greater theory.
The equations of GR, QM and classical mechanics alike have no "scale" parameter, so they are in fact claiming to apply at every scale in the most important way - in the math.
Not only does the math apply at any scale, but no one has any idea how to add a scale parameter to prevent it from doing so, or what value that parameter should have. QM at least has the Measurement Postulate that could allow this to fit, but no scale is added.
Note that when I say "a scale parameter", I'm referring to something like the sqrt(1-v²/c²) of special relativity, but for "size", added to the Schrodinger equation and to Einstein's equations, that would mean they take the "scale" of the phenomenon into account. Without such a parameter, the equation says that it applies to a star as well as to a neutron. The only reason we don't apply them that way is that we have already tried and we know they give the wrong results.
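To make the analogy concrete, the special-relativity factor referred to above is just standard physics:

```latex
\gamma \;=\; \frac{1}{\sqrt{1 - v^2/c^2}} \;\longrightarrow\; 1 \quad \text{as } v/c \to 0
```

That limit is what guarantees Newtonian mechanics re-emerges at everyday speeds. A hypothetical "scale" parameter would have to play the analogous role for size, and nothing of the sort appears in either the Schrodinger equation or Einstein's field equations.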
Also, both GR and QM give the right results if applied at the scales of day-to-day life. You can use the Schrodinger equation and the Born postulate to compute where two trains traveling in opposite directions at some speed will meet, or you can use Einstein's field equations, and you'll get the same answer within some small margin of error (with some reasonable assumptions, such as an almost flat spacetime in the area).
Furthermore, there are at least significant numbers of QM practitioners who do believe that QM applies at any scale - those who believe in the Many Worlds Interpretation, which states this very explicitly. On the GR side, the limitations of GR if applied at subatomic scales are well accepted and considered a flaw in the theory - which is why people hope to replace it with a theory of quantum gravity.
For civilian applications, sure. But Shor's seems like a good enough reason for defense, and defense has deep enough pockets to get this somewhat rolling.
I've actually pushed rubidium ions around in UQ's setup, and a bunch of us had a tour and Q&A with the founder. It's incredibly detailed stuff (insane optical tables, and weeks tracking down a stray hair that was contaminating the high-vacuum system). I'm still on the sceptical side, but some of the tricks they use to move things with atomic-level precision make it hard to imagine there won't be something useful coming out of it.
Whether it will ever justify the investment is, of course, another question.
We see a similar thing with AI/ML, where there is a huge amount of hype - but the applications of quantum computation are restricted to searching an unstructured database, finding the prime factors of a number, solving linear systems of equations, computing knot invariants, and the obvious one, which is quantum simulation. [1]
It would be better to just expose this as a library of functions and then hook it up to a cloud service to solve - which Amazon, Microsoft and IBM have. Microsoft and IBM are using their own hardware, and Amazon is reselling other providers'. [2,3,4,5]
Post-quantum cryptography algorithms are already being researched [7]. As for timelines: when I was reading a great deal of quantum algorithm papers for a class, I asked the professor how long it would take, and the answer was that feasible quantum computers are most likely 80 years away.
The interesting strategy, if you were to hack an organization that has encrypted backups, would be to exfiltrate the backups and then wait for a quantum computer that can break them. That is why post-quantum encryption needs to be researched now, even though the algorithms involved are still in their early stages.
The state of quantum computing is much worse than that of AI/ML.
AI/ML easily demonstrates superiority - playing games, classification, translation, generative art, etc.
QC is stuck at no practical use, with claims that it'll stay this way for decades - some say forever, as there may be physical walls that can't be broken.
There were fewer than 40 years between the first integrated circuit and the world wide web. I mostly share your QC skepticism, but as a bet this would be risky.
I would argue against QC from a theoretical standpoint (i.e. will the asymptote of error correction be enough to beat the asymptote of noise?), not a technological one. Recall that not long ago a single transistor was a whole experiment on its own, and today there are single chips with a trillion (!) of them.
On the other hand, from the moment the idea of computing was first used as a top-secret weapon until it became a commercially available product was around 10-15 years: in 1936 Turing published his paper on computability; in 1938 the Polish mathematicians attacking Enigma built their "bomba" machines to help with decryption; by 1948 the first working stored-program electronic computer, the Manchester Baby, had run; and by 1951 the UNIVAC I was being sold to corporations. Quantum computing has not advanced anywhere near this fast (because it is a much, much harder problem). Past pace of technological advance can't be used to predict future pace in a quite different field.
Right - think of fission's fast progress versus fusion's very slow progress. Fission and fusion are very much about the same thing, so they should be equally easy, right? But they're not.
There were simple computational devices going back centuries and, depending on the definition, even millennia. The axioms of computation were also entirely formalized and proven long before electronic computers were born.
Also, for many of the things we would use a QC for we have digital computers, and they pick most of the low hanging fruit... this was not the case when digital computers started out.
The original computers were big because they were mechanical, and they shrank as they became electronic and as the engineering grew more precise.
QCs are big because the energy scales involved require complex equipment to focus energy and remove heat, akin to the tyranny of the rocket equation.
There's a bunch of different modalities that have varying possibilities of being miniaturized, but neutral atom is probably the hardest (that I'm aware of). It's important to remember that basically no one who takes QC seriously thinks we're going to have quantum in smartphone-sized devices any time soon. It's really better to think of them as specialized co-processors than as general purpose computational units, of which we're more in the "mainframe" era than the "smartphone" era.
The scalability of quantum computing tech is a major concern, but no one is claiming that it'll be miniaturized to that extent, nor that this is even a goal.
If computational tech cannot be miniaturized then it is dead: only at massive production volumes does the technology become useful enough to become inexpensive. This is exactly what happened with normal (non-quantum) computers. Imagine if computers were still the size of a room... terrifying thought.
You're right that it does need to be miniaturized to some extent, but there's absolutely no need for it to be desktop-sized. If a quantum computer is ever built, taking up multiple rooms is not an issue at all.
The value is in a quantum computer existing at all, not its availability to consumers.
I once saw a cutaway of a D-Wave computer (not a "real" quantum computer, but also cryogenic). The whole device is roughly the size of an old-school mainframe - a few wardrobes. But it's mostly layers of insulation and cooling; the business bit inside fits in your hand. Or so I remember, at least.
The actual cooling (the stuff that pumps liquid nitrogen and helium) was external to all this.
I'm pretty skeptical about the whole field myself, but miniaturization would presumably come via some new (disruptive) hardware technology. ICs weren't made by miniaturizing vacuum tubes.
Of course (back to skepticism), it's not as if no one thought to try using quantum mechanics and history is starting over at the '60s. Modern QC research comes after decades of failed ideas, pursued in an era when we were far more technologically knowledgeable than in the early days of computing.
I'm not sure - I wonder if there are some back-of-the-envelope estimates that could be done to see how small you could theoretically shrink that portion. At least when you make it small, the materials cost is minimized! You could make it out of exotic materials and it might still be cost-effective. I wouldn't worry too much about safety once it's small; there is a minimal amount of harm caused by ultra-cold or ultra-high vacuum at that scale.
> I wouldn't worry too much about safety once it's small; there is a minimal amount of harm caused by ultra-cold or ultra-high vacuum
In a previous life I worked with NMR machines, the ones with superconducting magnets cooled by liquid helium which is itself cooled by liquid nitrogen.
I would dispute "minimal amount of harm" - part of our training involved what to do if the magnet quenches, and I recall "run for the exit before you suffocate" was basically the SOP...
Anyway, they were loads of fun to work with, I won't ever forget that time I nearly had my house keys snatched out of my hand by one, but back then (25 years ago) they occupied entire rooms. AFAIK they still do.
OT but I had an MRI a couple of weeks ago, and forgot to take off my gold wedding band. I could distinctly feel the magnetic field pulsing in my ring as the scan started. After a brief moment of sheer panic I realised it wasn't a problem ... and as I lay there I was idly wondering about just how much gold was in my ring :)
> I would dispute "minimal amount of harm" - part of our training involved what to do if the magnet quenches, and I recall "run for the exit before you suffocate" was basically the SOP...
Yes, at the current scale it would be very hazardous, but at miniature scale, how much damage could a gram of liquid helium do, considering it would have to make its way through the internals of a machine before reaching skin?
Gotta admit, the transistorization of the computer - the exponential decay in transistor size, and the exponential decay in switching time (the FO4 delay, which is much less than a clock cycle) - was beautiful while it lasted. Now it's back to the drawing board.
Same as airplanes, basically the same since 1960, like we fly on Super Fortresses with the bomb bays replaced with cargo holds...like different dispenser, and the plexiglass fishbowl artillery in the front done differently. I would love to be in one of those fishbowls, like all exposed flying at the horizon like panoramic view. So suicidal, like all aviation.
Like not getting shot at like in Catch-22 though. Hopefully.
> Same as airplanes, basically the same since 1960, like we fly on Super Fortresses with the bomb bays replaced with cargo holds
Nitpicking: the B-29 Superfortress was a propeller-driven aircraft [1]. Modern commercial planes generally have jet engines, which represent a leap forward in aerospace engineering.