Quantum winter is coming (backreaction.blogspot.com)
297 points by nsoonhui on Nov 5, 2022 | 273 comments



From the article: "The record breaking “useful” calculation for quantum computers is the prime-number factorization of 21. That’s the number, not the number of digits. Yes, the answer is 3 times 7, but if you do it on a quantum computer you can publish it in Nature. In case you are impressed by this achievement, please allow me to clarify that doing this calculation with the standard algorithm and error correction is way beyond the capacity of current quantum computers. They actually used a simplified algorithm that works for this number in particular."

Oh. I thought things were further along than that.


It's wild that I've seen a few LinkedIn invites and even job postings for "quantum computing AI". The tone is always along the lines of building up your quantum computing software development skills, since it will be the next big thing in industry.

I've worked with quantum computing researchers before (I mean in the same building, not doing the work). It's interesting work, but we're still at the stage where a focused background in physics research is the prereq, not skills with quantum algorithms and their implementations. "Programming quantum computers" is still physics, not software engineering.


They’re evidently selling shovels for the gold rush, before the gold is found


There have been a lot of claims of larger numbers, but they all rely on some combination of (a) "classical preprocessing", where you find the answer on a normal computer first and use it as an input to the quantum algorithm, and/or (b) selecting numbers with extremely specific mathematical properties.


Apparently that's what it is, and noise seems to be a huge problem, as it seems to grow exponentially. Think crazy cool fridges that need to be orders of magnitude better, not just linearly better, if you want to entangle more qubits.


I thought we were still factoring 4 tbh. No sarcasm.

Dumb question, is 21 harder than 4?


Generally problems that require more bits are "harder"


I get the notion of structuring problems so that the probability amplitudes of incorrect and unnecessary computation paths cancel out. But I struggle to follow the logic of how that appears in factoring. Are there any simpler examples where this is easier to grok?


As a person who works in Quantum Computing, I can concur that there's hype in the field; one of the professors in my university has a quantum machine learning (TM) startup that seems to be performing well, even though most of the faculty and grad students can tell you that it's bullshit.

However. This paragraph straight up displays a fundamental lack of understanding by Hossenfelder:

> Last time I looked, no one had any idea how to do a weather forecast on a quantum computer. It’s not just that no one has done it, no one knows if it’s even possible, because weather is a non-linear system whereas quantum mechanics is a linear theory.

Unitary evolution generated by the Schrödinger equation is a linear map on _probability amplitudes_, just like how classical (probabilistic) computing performs linear operations on _probability distributions_. The commonly used quantum circuit model is a superset of classical logic gates and can accomplish anything a probabilistic classical computer can, so if anything is possible in a classical computing scheme, it's also possible in the quantum circuit scheme.
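
For concreteness, here's a minimal numpy sketch (my own toy example, not something from the article) of that "superset" point: the Toffoli gate is a perfectly valid quantum gate (an 8x8 unitary), yet on computational basis states it reproduces the classical AND truth table, which is why any classical boolean circuit can in principle be compiled into a quantum circuit.

    import numpy as np

    # Toffoli: flip the third qubit iff the first two are 1. As a matrix, this is
    # the 8x8 identity with the |110> and |111> rows swapped.
    toffoli = np.eye(8)
    toffoli[[6, 7]] = toffoli[[7, 6]]

    def basis(a, b, c):
        """Computational basis state |a b c> as an 8-dimensional amplitude vector."""
        v = np.zeros(8)
        v[4 * a + 2 * b + c] = 1.0
        return v

    for a in (0, 1):
        for b in (0, 1):
            out = toffoli @ basis(a, b, 0)   # target qubit starts in |0>
            c = int(np.argmax(out)) % 2      # read the target bit of the output state
            print(f"{a} AND {b} = {c}")      # reproduces the classical truth table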

I don't have much sympathy for her, since this is not the first time Hossenfelder has displayed a lack of understanding: recently she published a paper criticizing another one [1], now replaced with a much shorter text after being corrected [2] by the authors of the original paper.

Yeah I get it, it's dumb when the president of BofA is talking about how QC is "the next big thing", I know it's not coming Soon^TM, but saying "we will never have a quantum computer because the current ones suck" has the same energy as "the world doesn't need more than 5 computers" imo.


> Unitary evolution generated by the Schrödinger equation is a linear map on _probability amplitudes_, just like how classical (probabilistic) computing performs linear operations on _probability distributions_. The commonly used quantum circuit model is a superset of classical logic gates and can accomplish anything a probabilistic classical computer can, so if anything is possible in a classical computing scheme, it's also possible in the quantum circuit scheme.

To predict weather on a computer, we need to run large CFD simulations. When we do this on a classical computer, this involves a discretization of a system of PDEs with millions or billions of degrees of freedom, requiring 4 or 8 bits per floating point number. It may be possible to do the same CFD simulations on a quantum computer, but this is severely constrained by the small number of qubits currently available on quantum computers. And clearly, even if you could run the same algorithm, presumably the point of using a quantum computer would be to reap the "quantum advantage" in order to do something algorithmically superior to what's possible on a classical computer.

I think this is a pretty small point to get hung up on. The rest of her article is perfectly reasonable.


It really is not, because this is a semi-common misconception and to me signals basic unfamiliarity with the subject. Schrodinger's eqn being linear has nothing to do with implementing non-linear functions. There's no question about how you would implement said logic in a quantum computer - you can just do what the classical implementation does. Yeah, we don't have anywhere near enough qubits and it would be a gross waste of resources, but we _do_ know how to do it. Saying they don't know how to do it is a false statement at best. Again, I'm not saying we're going to be solving weather models with a quantum computer soon - even though I know folks are working on QC algorithms for (nonlinear) PDE solutions.


> Schrodinger's eqn being linear has nothing to do with implementing non-linear functions.

Indeed. And it's possible SH is confused by this, since she had another video about quantum chaos in asteroids where a similar observation applied and she didn't address it. However...

> There's no question about how you would implement said logic in a quantum computer - you can just do what the classical implementation does. Yeah, we don't have anywhere near enough qubits and it would be a gross waste of resources, but we _do_ know how to do it. Saying they don't know how to do it is a false statement at best.

Here you're being a little uncharitable. Indeed, theoretically one could make the quantum computer simulate a classical computer running the non-linear weather algorithm. But the interesting point Hossenfelder may be making here is that there is no known way to make quantum computers calculate/simulate the weather evolution in a "quantum computer way" - that is, not simulating a discrete-state classical computer, which would be wasteful and most probably offer no advantage, but realizing the differential equation evolution in analog mode, using the quantum superposition capabilities. That is not known to be possible. A quantum computer may be an analog computer (continuous evolution of state), but it is not clear how to use it to integrate interesting sets of differential equations like weather models.


This paper has a very interesting claim: that they can integrate arbitrary non-linear differential equations on a quantum computer, with advantage. If their analysis is correct, this makes the case for weather prediction on quantum computers much stronger.

https://arxiv.org/pdf/2011.06571.pdf


> 4 or 8 bits per floating point number

Bytes, not bits.


Typo... thanks.


Yeah the "superset" perspective seems like you could end up cheating, implementing the equivalent of classical logic and probably much more slowly than a classical computer could do it.

I googled around and found this research though, which does propose using a nonlinear quantum system: https://arxiv.org/abs/2210.17460. It doesn't really claim the issue is solved.


> I think this is a pretty small point to get hung up on. The rest of her article is perfectly reasonable.

The above isn't the only place that betrays her lack of understanding, though.

For instance, she confidently writes "Ion traps are used for example by IonQ and Honeywell. They must “only” be cooled to a few Kelvin above absolute zero," but this is just wrong; trapped-ion qubits do not, a priori, require cryogenic cooling. Yes, lowering the temperature can be useful for incidental reasons, as it improves the vacuum quality and reduces some technical excess noise sources, but this is simply an engineering choice. Many of the high-profile results in trapped-ion quantum information processing were in fact achieved in room-temperature systems. And even if one does opt for cryogenic cooling, the ~tens of Kelvin regime of interest here is incomparably easier to reach than the tens of milli-Kelvin required for superconducting qubits and other solid-state spin platforms (where those elaborate dilution refrigerator "chandeliers" are actually required to keep the qubits intact). In fact, in ratiometric terms, the temperatures of interest are actually closer to room temperature than to that millikelvin regime!

Like many physicists, I'd naturally be inclined to agree with Sabine Hossenfelder as far as her distaste of marketing hype is concerned, but in making authoritative-sounding statements without having the knowledge to back them up, and misrepresenting what one would hope she knows are the actual scientific facts in the service of a punchy script, she is hardly doing any better than those private-sector hype evangelists she ridicules. Beware of Gell-Mann Amnesia…


> Yeah I get it, it's dumb when the president of BofA is talking about how QC is "the next big thing", I know it's not coming Soon^TM, but saying "we will never have a quantum computer because the current ones suck" has the same energy as "the world doesn't need more than 5 computers" imo.

From the outside, QC looks less like traditional computing (as you're suggesting) and more like cold fusion. There are plenty of hopeful stories and investments, but it's hard to tell if it'll ever happen in a meaningful way.


You left out the footnote links



>> one of the professors in my university has a quantum machine learning (TM) startup

I'm guessing you go to UofT or Waterloo.


I studied experimental condensed-matter physics and this article is spot on. The PIs at my university were holding clouds of rubidium atoms with laser tweezers at microkelvins in ultra-high vacuum, agitating them with a laser beam, and publishing papers saying "look it responds sort of like a qubit!"

If you have seen the kind of equipment required to perform these experiments, it's absolutely unimaginable that these concepts could be miniaturized enough that someone would be able to put them in a desktop size box, and to do so usefully and safely within a timeline that is competitive with the advancement of microelectronics.


I think the most realistic goal was always a main-frame type computer that could perform computations for special purposes. Not necessarily to shrink it down to phone-size or even a desktop. But maybe the marketing did really have this objective.


Huh I’m not an expert but doesn’t AWS already offer quantum computing as a service?

https://aws.amazon.com/braket/


They do not offer anything of use. This is just a demo for an API that in principle can be used as an "assembly language" for quantum computers, but it either runs on simulators that can not do more than 10 "perfect" qubits, or on hardware that has 100-ish "noisy" qubits. You need about a million "noisy" qubits, together with error correction codes, to encode 1000 logical "perfect" qubits. That is about the minimal number at which you can do something useful.
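
As a rough sanity check of those numbers (my own back-of-envelope, using the commonly quoted surface-code rule of thumb of roughly 2*d^2 physical qubits per logical qubit at code distance d; the distance below is just an assumption):

    # code distance in the range often quoted for cryptographically relevant machines (assumption)
    code_distance = 25
    physical_per_logical = 2 * code_distance ** 2   # ~1250 noisy qubits per logical qubit
    logical_qubits = 1000                           # the "minimal useful" count from above

    print(physical_per_logical * logical_qubits)    # -> 1,250,000, i.e. on the order of a million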


Once we get to 640,000 perfect qubits we'll be done, because no one needs more than 640k.


All true, but in the context of the thread, the future of quantum computing will probably be remote. So the bulk of the machinery is irrelevant.


Unless the bulk required to be useful is infeasible to assemble anywhere.


You'll need to build a Dyson Sphere to power your quantum chip factory!

https://dyson-sphere-program.fandom.com/wiki/Quantum_Chip

QUANTUM CHIPS Running low? Never Again! | Dyson Sphere Program Master Class

https://www.youtube.com/watch?v=O-xAZj0C2yo


I thought this was simulated quantum annealing.


Problem is, practically any special purpose can probably be done more cost effectively by brute forcing with conventional computing.


Cost of brute-forcing certain things (like NP-hard^W^W NP-complete problems) on regular computers quickly exceeds the cost of building and operating a quantum computer. Hence all the trying to make these contraptions work on non-toy problems.


We don't know of a single NP-hard problem where quantum computers would show any exponential speed-up (there may be some speed-up by using the O(sqrt(n)) brute search quantum algorithm instead of the classical O(n) search algo, but that will be quickly drowned out by the exponential factors).

Even worse, there is no reason to think that there will ever be - as far as we know, QCs only show an exponential advantage on problems with very very specific structures, while the whole problem of NP-hard problems is that they have no structure in general.


I thought there were concerns about public key cryptography - factoring large numbers much faster with QC at least sounds plausible, but maybe it’s just speculation ?


That is true, but integer factorization is not an NP-hard problem, as far as we know at least. A quantum algorithm, Shor's algorithm, is indeed known to be able to solve it in polynomial time if you have a QC.

It is suspected to be NP but not even NP-complete, nevermind NP-hard. It is suspected not to be in P, but that is not yet proven.


Integer factorization is obviously in NP, though as you say, whether it is in the P subset of NP is still an open question.


I was trying to be careful with my language specifically to avoid this mistake, but I still messed up...


It's tricky!


We're moving on to quantum-resistant public key cryptography, so unless some QC hardware breakthrough appears really soon, we'd expect quantum computers to be useless against public key cryptography because we won't rely on the difficulty of factoring large numbers as a security measure by the time large enough quantum computers appear.


Indeed, NP-hard is wrong! NP-complete is the interesting class where quantum computers theoretically could help.


From what I've read on Scott Aaronson's blogs, NP-complete is also wrong. The general suspicion seems to be that polynomial quantum algorithms (BQP) are separate from NP - they are suspected to be neither a subset nor a superset of NP.

Some problems in BQP are suspected to be in NP (integer factorization, for which the best known classical algorithm is sub-exponential, but we have a polynomial time quantum algorithm), but there is no known NP-complete problem for which a quantum algorithm is known, or even suspected to exist.

Edit - some links:

[0] https://www.scottaaronson.com/papers/npcomplete.pdf - chapter 4

> If we interpret the space of 2^n possible assignments to a Boolean formula φ as a “database,” and the satisfying assignments of φ as “marked items,” then Bennett et al.’s result says that any quantum algorithm needs at least ∼2^(n/2) steps to find a satisfying assignment of φ with high probability, unless the algorithm exploits the structure of φ in a nontrivial way. In other words, there is no “brute-force” quantum algorithm to solve NP-complete problems in polynomial time, just as there is no brute-force classical algorithm.

[1] https://youtu.be/0jrybODBUpA?t=30m28s "P versus NP"


No. Quantum computers are not known to provide any speed up to any NP problems.


That is a bit too strong a claim (even taking "NP" to mean "NP - P", since all P problems are also in NP).

First of all, while not proven, it is considered most likely that integer factorization is not in P, so potentially we already know of 1 NP-P problem which can have an exponential speed-up from a QC (Shor's algorithm).

Secondly, there is one non-exponential speedup that can potentially apply to even NP-complete problems - using Grover's algorithm to find an element in an unordered list with complexity O(sqrt(n)) instead of the classical O(n).
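
If it helps, the Grover speed-up is easy to see in a toy statevector simulation (a textbook sketch I wrote for illustration, not tied to any particular hardware or library):

    import numpy as np

    n = 4                     # qubits
    N = 2 ** n                # size of the "unordered list"
    marked = 11               # index of the item we are searching for (arbitrary choice)

    state = np.full(N, 1 / np.sqrt(N))   # uniform superposition over all indices

    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))   # ~ (pi/4) * sqrt(N)
    for _ in range(iterations):
        state[marked] *= -1               # oracle: flip the sign of the marked amplitude
        state = 2 * state.mean() - state  # diffusion: reflect every amplitude about the mean

    print(iterations, "iterations; P(marked) =", state[marked] ** 2)
    # With n = 4 this prints ~0.96 after only 3 iterations, versus up to 16 classical lookups.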


Isn't the whole problem in that n? This somehow assumes that the number of qubits can magically grow without a fuss. What if, to entangle more qubits, you need to cool the whole system down exponentially / use exponentially more energy? All those O(...) notations flip over, don't they?


Assuming the current version of Quantum Mechanics is correct, no. Given that we know QM is not compatible with General Relativity, which means one or both of them must be wrong, we could potentially discover some limitation of QM while investigating this.

Ironically, if that were to happen, it would probably be a much more important boon for humanity than if we successfully build a working QC.


There isn’t actually any reason to believe one of them “is wrong” (we know they’re both approximations.) We just think that the macro and micro should follow the same fundamental laws because that’s how things have been thought to work so far.

It could very well be that they are both great approximations and it’s actually the underlying information structure that shifts depending on scale. This doesn’t seem likely to us perhaps, but only because of existing intuition which we know is likely wrong at some level.


Sure,"wrong" is a bit strong. I mean "wrong" in the same sense as Newtonian mechanics is wrong, which is exactly that it is an excellent approximation in some domains, but fails in others.

> It could very well be that they are both great approximations and it’s actually the underlying information structure that shifts depending on scale.

Right now, both QM and GR claim that they apply at any scale. If it turns out that the laws of physics change with scale, that means that both QM and GR are wrong, even though they may each be perfectly correct at the scale they have been seen to work so far.


>both QM and GR claim that they apply at any scale

the thing is I don't believe that either does make such a claim. I believe certain people have said that and the untrained masses may assume that's the case. But I don't think the scholarly proponents or intellectual founders of either system made such a claim (in fact Newton was religious and Einstein believed we were way off by his death.)

>that means that both QM and GR are wrong

How though? They are both right for their use case so are likely subsets of a greater theory.


The equations of GR, QM, and classical mechanics have no "scale" parameter, so they are in fact claiming they apply at every scale in the most important way - in the math.

Not only does the math apply at any scale, but no one has any idea how to add a scale parameter to prevent it from doing so, or what value that parameter should have. QM at least has the Measurement Postulate that could allow this to fit, but no scale is added.

Note that when I say "a scale parameter", I'm referring to something like the sqrt(1-v²/c²) of special relativity, but for "size", added to the Schrodinger equation and to Einstein's equations, that would mean they take the "scale" of the phenomenon into account. Without such a parameter, the equation says that it applies to a star as well as to a neutron. The only reason we don't apply them that way is that we have already tried and we know they give the wrong results.

Also, both GR and QM give the right results if applied at the scales of day to day life. You can use the Schrodinger equation and the Born postulate to compute where two trains traveling in opposite directions at some speed will meet, or you can use Einstein's field equations, and you'll get the same answer within some small margin of error (with some reasonable assumptions, such as an almost flat spacetime in the area).

Furthermore, there are at least significant numbers of QM practitioners who do believe that QM applies at any scale - those who believe in the Many Worlds Interpretation, which states this very explicitly. On the GR side, the limitations of GR if applied at subatomic scales are well accepted and considered a flaw in the theory - which is why people hope to replace it with a theory of quantum gravity.


Just make an amoeba computer?


At least it would be something new


First thing would be to figure out what that special purpose could possibly be. We're not even there yet ...


My friends in the industry tell me that predicting protein folding is a popular civilian use case.


Seems like the easiest way to use quantum mechanics to predict how a protein will fold might be to actually build that protein and watch.


yes, because watching molecules is easy…right?


For civilian applications, sure. But Shor’s seems like a good enough reason for defense and defense have deep enough pockets to get this somewhat rolling.


Grover has some plausible applications as well as Shor.

And Shor is based on the quantum Fourier transform, if I recall correctly, which could have applications outside of discrete log.

Disclaimer: I’m not an expert on this stuff, I’m sure someone will correct me if I’m wrong because there are real pros on here.


The special purpose is figuring out what the special purpose is :)


I've actually pushed rubidium ions around in UQ's setup, and a bunch of us had a tour and Q&A with the founder. It's incredibly detailed stuff (insane optical tables, and weeks tracking down a stray hair that was contaminating the high vacuum system). I'm still on the sceptical side, but some of the tricks they use to move things with atomic-level precision make it hard to imagine there won't be something useful coming out of it.

Whether it will ever justify the investment is, of course, another question.


We see a similar thing with AI/ML, where there is a huge amount of hype, but applications of quantum computation are restricted to searching an unstructured database, finding prime factors of a number, solving a linear system of equations, computing knot invariants, and the obvious one, which is quantum simulation. [1]

It would be better to just expose this as a library of functions and hook it up to a cloud service to solve, which Amazon, Microsoft and IBM have done. Microsoft and IBM are using their own hardware, and Amazon is reselling other providers' hardware. [2,3,4,5]

Research on post-quantum cryptography algorithms is already under way [7], but feasible quantum computers are most likely 80 years away - that was the answer I got when I was reading a great deal of quantum algorithm papers for a class and asked the professor how long it would take.

The interesting strategy, if you were to hack an organization which has encrypted backups, would be to exfiltrate the backups and then wait for a quantum computer that could break them - which is why post-quantum encryption needs to be researched, but the algorithms involved are still in their early stages.

[1] https://en.wikipedia.org/wiki/Quantum_algorithm

[2] https://quantumai.google/hardware

[3] https://azure.microsoft.com/en-us/solutions/quantum-computin...

[4] https://aws.amazon.com/braket/

[5] https://www.ibm.com/quantum

[6] https://en.wikipedia.org/wiki/Post-quantum_cryptography?wpro...

[7] https://pqcrypto.org/conferences.html


State of quantum computing is much worse than AI/ML.

AI/ML easily demonstrates superiority - from playing games, classification, translation, generative art etc.

QC is stuck at no practical use with claims that it'll stay this way for decades, some claiming forever as there may be physical walls that can't be broken.


Totally agree - I should have played that up much more in my comment.


My current ML project is funded because the bank regulator said it would have to be done or there would be a big fine.

I do not think that there will be a QC project done on the same basis for 40 years.


There were less than 40 years between the first integrated circuit and the world wide web. I mostly share your QC skepticism, but as a bet this would be risky.


I would argue against QC from a theoretical standpoint (i.e. will the asymptote of error correction be enough to beat the asymptote of noise?), not from a technological one. Recall that not long ago a single transistor was a whole experiment on its own, and today there are single chips with a trillion (!) of them.


On the other hand, the time from the moment the idea of computing was first used as a top secret weapon to the moment it became a commercially available solution was around 10-15 years (in 1936 Turing wrote his paper on computability, in 1938 the Polish mathematicians researching Enigma built their "Bomba" machines to help with decryption, by 1948 the first stored-program electronic computer, the Manchester Baby, was running, and by 1951 you had UNIVAC I, which was being sold to corporations). Quantum computing has not advanced anywhere near this fast (because it is a much, much harder problem). Past pace of technological advance can't be used to predict future pace in a quite different field.


Right - think of fission's fast progress versus fusion's very slow progress. Fission and fusion are very much about the same thing, so they should be equally easy, right? But they're not.


Fission is just putting two rocks together, and you get energy. Fusion is a little more complicated in practice.


There were simple computational devices going back centuries and, depending on the definition, even millennia. The axioms of computation were also entirely formalized and proven long before electronic computers were born.


True, I forgot especially about Babbage and Lovelace's contributions...

Still, I think my point stands if we limit it to electronic computers.


Also, for many of the things we would use a QC for we have digital computers, and they pick most of the low hanging fruit... this was not the case when digital computers started out.


The original computers were big because they were mechanical, and they shrank by moving to electronics and through more precise engineering.

QCs are big because the energy levels are so high that they require complex equipment to focus energy and remove heat, akin to the tyranny of the rocket equation.


60 years ago.


There's a bunch of different modalities that have varying possibilities of being miniaturized, but neutral atom is probably the hardest (that I'm aware of). It's important to remember that basically no one who takes QC seriously thinks we're going to have quantum in smartphone-sized devices any time soon. It's really better to think of them as specialized co-processors than as general purpose computational units, of which we're more in the "mainframe" era than the "smartphone" era.

> "look it responds sort of like a qubit!"

And behave like them too ;)


The scalability of quantum computing tech is a major concern, but no one is claiming that it'll be miniaturized to that extent, nor that this is even a goal.


If computational tech cannot be miniaturized then it is dead, as only at massive production quotas does the technology become useful enough to become inexpensive. This is exactly what happened with normal (non quantum) computers. Imagine if computers were still the size of a room... Terrifying thought


You're right that it does need to be miniaturized to some extent, but there's absolutely no need for it to be desktop sized. If in the future a quantum computer is built, taking up multiple rooms is not an issue at all.

The value is in a quantum computer existing at all, not its availability to consumers.


Miniaturization really kicked up because we needed to shoot computers into orbit and minimizing mass was a crucial optimization for that problem.

Might see a similar bout of miniaturization if we can come up with a good defensive/offensive application for putting quantum computers in orbit.


How much of that size is cryo-equipment though?


I once saw a cutaway of a D-Wave computer (not a "real" quantum computer, but also cryogenic). The whole device is roughly the size of an old-school mainframe - a few wardrobes. But it's mostly layers of insulation and cooling. The business bit inside fits in your hand. Or so I remember, at least.

Actual cooling (stuff that pumps liquid nitrogen and helium) was external to all this.


That's my point; how do you propose to miniaturize cryogenic equipment (and make it safe for the general public to use)?

The UHV equipment is pretty intense too, fwiw.


I'm pretty skeptical about the whole field myself, but miniaturization would presumably come via some new (disruptive) hardware technology. IC's weren't made by miniaturizing vacuum tubes.

Of course (back to skepticism), it's not like no one thought to try using quantum mechanics and history is starting over at the 60's. Modern QC research comes after decades of failed ideas, in an era when we are much more technologically knowledgeable than in the early days of computing.


I'm not sure. I wonder if there are some back of the envelope estimates that could be done to see how small you could theoretically shrink that portion. At least when you make it small the materials cost is minimized!! Could make it out of exotic materials and it might still be cost effective. I wouldn't worry too much about safety once it's small; there is a minimal amount of harm caused by ultra cold or ultra high vacuum.


> I wouldn't worry too much about safety once it's small; there is a minimal amount of harm caused by ultra cold or ultra high vacuum

In a previous life I worked with NMR machines, the ones with superconducting magnets cooled by liquid helium which is itself cooled by liquid nitrogen.

I would dispute "minimal amount of harm", part of the our training involved what to do if the magnet quenches, I recall "run for the exit before you suffocate" was basically the SOP...

Anyway, they were loads of fun to work with, I won't ever forget that time I nearly had my house keys snatched out of my hand by one, but back then (25 years ago) they occupied entire rooms. AFAIK they still do.

OT but I had an MRI a couple of weeks ago, and forgot to take off my gold wedding band. I could distinctly feel the magnetic field pulsing in my ring as the scan started. After a brief moment of sheer panic I realised it wasn't a problem ... and as I lay there I was idly wondering about just how much gold was in my ring :)


> I would dispute "minimal amount of harm", part of the our training involved what to do if the magnet quenches, I recall "run for the exit before you suffocate" was basically the SOP...

Yes, at current scale it would be very hazardous, but at miniature scale a gram of liquid helium could do how much damage considering it would have to make its way through the internals of a machine to contact skin?


Gotta admit, the transistorization of the computer - the exponential decay in the size of the transistor, and the exponential decay in its switching time (FO4 delay, which is much less than the clock cycle) - was beautiful while it lasted. Like now it's back to the drawing board.

Same as airplanes, basically the same since 1960, like we fly on Super Fortresses with the bomb bays replaced with cargo holds...like different dispenser, and the plexiglass fishbowl artillery in the front done differently. I would love to be in one of those fishbowls, like all exposed flying at the horizon like panoramic view. So suicidal, like all aviation.

Like not getting shot at like in Catch-22 though. Hopefully.


> Same as airplanes, basically the same since 1960, like we fly on Super Fortresses with the bomb bays replaced with cargo holds

Nitpicking: the B-29 Superfortress was a propeller-driven aircraft [1]. Modern commercial planes generally have jet engines. Jet engines represent a leap forward in aerospace engineering.

[1] https://en.m.wikipedia.org/wiki/Boeing_B-29_Superfortress


I can't help but think capitalism plays a part in encouraging this kind of pathological behavior.


Quantum computation plays a starring role in some of my favorite recent sci-fi novels (Hannu Rajaniemi's Quantum Thief, for example), but like William Gibson's Neuromancer world (powered by Eastern Seaboard Fusion Reactors), it's just interesting and speculative sci-fi.

Happily, there are many fields besides computing where quantum technology comes into play - better and cheaper chip fabrication, semiconductor lasers and diodes, all kinds of materials science research, and of course, solar energy conversion systems modeled on the photosynthetic apparatus:

https://sci-hub.se/10.1038/nature22012

Romero, et al. (2017). Quantum design of photosynthesis for bio-inspired solar-energy conversion. Nature

As far as what today's working scientists will pursue, the silly popular notion that researchers are free to explore whatever they find exciting and interesting is mostly nonsense; successful researchers in the modern science system are as keen as hounds on the scent for new funding disbursements from the major federal agencies (and some private sponsors). If the money dries up, they turn their attention to other things, except perhaps for a few back-burner projects handed off to some hopelessly naive yet charmingly enthusiastic grad student.


Eastern Seaboard Fission Authority ;)

Gibson got some stuff wrong, but it’s borderline scary how much he got right. Book is like 38 years old or something.


The biggest obstacle to the realization of quantum computing is not technical, imo, but theoretical.

It is well known, but not to laypeople, that a quantum computer is efficiently (quadratic overhead) simulable if it only operates on the eigenstates of the generalized Pauli matrices with the so-called Clifford operators. This is a really fancy way of saying that this group action is not dense in the unitary operators, which is itself a fancy way of saying that it behaves like rolling a die, instead of like rolling a ball.

In order to achieve density in the unitaries, it suffices to construct a single state that is not one of these stabilizer states - a so-called "magic state" (their language, not mine) - to a sufficiently high level of purity.

The much-touted paper which claims to do this only succeeds in showing that the problem is equivalent to some other problem which we also do not know how to solve (creating many noisier but separable copies of this state), and there is no particular reason to believe that it can be. Moreover, given what it would be able to do, it seems much more likely to me that there is a proof, waiting to be discovered, that there is a fundamental obstruction to harvesting such a state without at least waiting as long as you would have to wait to do your computation the old fashioned way.


Could you cite the "much touted paper" you're talking about?

In my view, making the noisy physical magic states is the easy part of the distillation process. You reset a qubit, then rotate it 90° around the Y axis, then 45° around the Z axis. That's the magic state. Note that the tolerance on those rotations is forgiving: getting them to within 10°, 95% of the time, is sufficient. All the error correcting code stuff that follows has fidelity requirements an order of magnitude stricter.
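
If you want to check that yourself, here's a minimal numpy sketch of those two rotations (my own illustration, assuming the standard Ry/Rz gate conventions), showing they land exactly on the ideal T-type magic state (|0> + e^{i*pi/4}|1>)/sqrt(2):

    import numpy as np

    def ry(theta):
        return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                         [np.sin(theta / 2),  np.cos(theta / 2)]], dtype=complex)

    def rz(theta):
        return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

    # reset to |0>, rotate 90 degrees about Y, then 45 degrees about Z
    psi = rz(np.pi / 4) @ ry(np.pi / 2) @ np.array([1, 0], dtype=complex)

    ideal = np.array([1, np.exp(1j * np.pi / 4)]) / np.sqrt(2)
    print(abs(np.vdot(ideal, psi)) ** 2)   # fidelity -> 1.0 (global phase drops out)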

As you note, there'd need to be some unforeseen obstacle for state prep to be the showstopper. Given how apparently easy it is to make these states, I think any obstacle like that would basically have to falsify quantum mechanics as we know it. It would be like finding out that light can't be diagonally polarized.


The ability to rotate by 45 degrees is equivalent to the construction of a magic state, as you have correctly identified, since it is not a Clifford operator, and adding it to your generators gives universal quantum computation on its own. You have pushed the problem sideways.

This is the paper, https://arxiv.org/abs/quant-ph/0403025, and it is well understood by the paper that the independence of the noisy magic states is necessary for the distillation process to proceed. Note that the probability of having some entanglement between your partial states goes up rather dramatically with the number of them that you have, and not obviously in a way that you can do anything about.


Thanks for the reference.

> Note that the probability of having some entanglement between your partial states goes up rather dramatically with the number of them that you have

Entanglement is not binary, it is continuous. If you start with states like CPHASE(5°)|TT>, a few rounds of distillation will have turned them into states like CPHASE(0.0000000000000001°)|TT>. Sure the output states are "still entangled", but the amount of entanglement is so negligible that you don't have to care. Such small distortions won't prevent trillion step computations from working.
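
To put rough numbers on "a few rounds" (my own illustration, using the leading-order error suppression eps -> 35 * eps^3 per round commonly quoted for the 15-to-1 protocol in the paper linked above):

    eps = 1e-2    # assumed error of the raw noisy magic states
    cost = 1      # raw states consumed per distilled output state
    for round_number in range(1, 4):
        eps = 35 * eps ** 3
        cost *= 15
        print(f"round {round_number}: error ~ {eps:.1e}, raw states per output ~ {cost}")
    # round 1: ~3.5e-05, round 2: ~1.5e-12, round 3: ~1.2e-34 -- far below what a
    # trillion-step computation needs, at a cost of 15**3 = 3375 raw states per output.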


No it definitely is technical, we need a lot of materials science and RF electronics to get things to work better with each other. We need better material growth and device fabrication techniques and better readout schemes.


The Quantum Computing fiasco is fascinating and underlines quite a few human-nature flaws.

- We're collectively less smart/rational than we think

- People with money/power are not much better than average

- We tend to be very gullible when we don't understand the underlying principles

The same human flaws can be seen in UFO/Conspiracy theories and to some extent in the crypto/NFT scene.

A lot of this is amplified by several orders of magnitude by incompetent journalism.


> - People with money/power are not much better than average

I think it's way worse than that: the only thing that people with money/power are better at ... is getting / hanging on to money and/or power.

They're not better at anything apart from that, by any objective measure.


From my experience, people with higher social status (and thus money/power) tend on average to be somewhat smarter, but not by a large margin.


A lot of that is correlated with more recreational time, better family lives (probably no one in jail), and higher quality education. This is the same logic the French aristocrats used to justify the oppression of the peasants: reading novels made them "better".


They're way better than average at understanding money and power. Obviously it is you that is confused into thinking this has something to do with scientific reality. Not saying I do either...


Rather simply, unfortunate coincidences. The people with money to invest happened to be computing-related people. "Quantum Computation" contains the word computation, which they understand, and Quantum, which stands for "mysterious, cool, ingenious". So they invested in it. Something like "developmental neuroscience" is nowhere near as cool-sounding.


Bitcoin is money the government doesn’t control. I like it but statists don’t.


Plenty of anarchists (e.g. libertarian socialists) don't like bitcoin either.


Money the government doesn't own, but can track with virtually perfect accuracy if you declare it on your taxes


People also forget that analogue computers, mechanical devices to perform calculations, can also be faster than digital computers in some situations. That doesn't make them commercially viable.

Seems to me that almost all of the quantum computing community is trying to be in the right place when they can start cracking current encryption standards at a commercial scale. At that moment, anyone with a functional quantum computer will drown in money. Then a few weeks later new quantum-resistant algorithms will appear and the gold rush will end. All the other quantum projects seem like attempts to keep one's foot in the market while waiting for that day.


Analog computers have mostly been electronic rather than mechanical for 60 years. "Digital" and "electronic" are not synonyms; they are completely orthogonal. Analog computers 60 years ago were mostly built with op-amps rather than shafts and gears, despite the survival of WWII-era mechanical naval fire control computers. That's in large part because human-scale shafts and gears max out with signals in the hertz to kilohertz range, while even the most ordinary op-amps can handle signals in the tens of kilohertz range (which has been true for 100 years) and op-amps in the tens of megahertz range have been available for 60 years. Also, shafts and gears have inherent errors of around 1% from backlash, while op-amps can usually do better than 0.1% (again, for the last 100 years) and by 60 years ago better than 0.001%.

So with off-the-shelf electronics an analog computer can compute 1000 times faster with 1000 times better precision than if it were mechanical. Until the 01960s they used vacuum tubes and so used more power and were less reliable; since then electronics have used less power and been more reliable.

Today we still use plenty of analog computation, but it's pushed to the margins. Every sound card has an antialiasing analog filter on its front end before switching to the digital domain. Even software-defined radios still use analog electronics to upconvert and downconvert signals between baseband or IF and the RF. Your Wi-Fi card can't sample that 2.4 GHz signal at its 4.8 Gsps Nyquist rate; doing that is not impossible but still requires high-end digital electronics. Submillimeter-wave communication is very much dependent on precise analog signal processing to modulate your desired signal into the hundreds of GHz range.

("Precise" in this case doesn't mean with linearity errors as low as 1%.)


Quantum resistant encryption is already available


Would you be able to give any examples of quantum resistant encryption algorithms? I'm not familiar with the field and my most recent knowledge is a post on hn saying that some post quantum candidates had been broken by old laptops.


Symmetric key cryptography as a whole (e.g. AES) is already quantum resistant. The problem only holds for public key encryption, but as the other commenter pointed out, there are already promising algorithms.


That's nice, I didn't realise AES was quantum resistant.

However, an algorithm being promising doesn't mean it works. Do you know how well the development of these other techniques is progressing?


I'm not an expert in the field, but there is already a NIST competition going on to standardize post-quantum public key ciphers. So I would say that we're at a good point in post-quantum cryptography development.


It's potentially quantum-resistant depending on how it's used. Grover's algorithm still reduces your effective key length by half in many situations.
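
The arithmetic behind that (standard figures, just a quick illustration of my own):

    import math

    # Grover needs on the order of sqrt(2**k) oracle calls to brute-force a k-bit key,
    # so the effective security level is roughly halved to k/2 bits.
    for key_bits in (128, 256):
        effective_bits = math.log2(math.isqrt(2 ** key_bits))
        print(f"AES-{key_bits}: ~{effective_bits:.0f}-bit security against Grover")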



No. Encryption that is "not provably quantum-insecure" is available. I doubt this will ever extend to "provably quantum-resistant".


You've got me intrigued -- what are examples of mechanical devices being faster than digital ones? Assuming you're talking man-made.

I'm trying to imagine and am totally stumped.


Fire control computers, like on a navy ship, were faster than digital computers of the day. YouTube has a number of videos on them.

An opamp performs multiplication faster than a digital computer (speed of light vs a few cycles). It's not super useful on its own, but it does fit the criteria.

In Veritasium's video 2/2 on analog computers [0] they show some startup products near the end.

[0]https://youtu.be/GVsUOuSjvcg?t=898


What no. Opamps don’t multiply and they don’t operate at the speed of light. They have some timescale that goes like their bandwidth, which depends on their feedback path.


Yes, feedback op-amps definitely have bandwidth limits. Although you can get ones in the gigahertz range now.

Analog multiplier ICs are available.[1] They're not common, and they cost $10-$20 each. Error is about 2% worst case for that one. There are several clever tricks used to multiply. See "Gilbert Cell" and "quarter square multiplier".

[1] https://www.digikey.com/en/htmldatasheets/production/1031484...


The propagation time around the feedback loop is still (length of loop) / (speed of light*slowdown constant), so yes, "at the speed of light".


This is absolutely not true, the speed of analog circuits is (by a significant margin) determined by parasitic capacitance, inductance, and resistance of the components. To put numbers to it, a typical high performance analog multiplier might have a loop length of 1cm for the feedback path. This circuit should theoretically operate at 30GHz, but realistically such circuits operate with a bandwidth measured in megahertz.


If your values are in the mechanical domain, doing a simple and fixed computation in the mechanical domain may be more efficient. An example would be a mechanical differential in a rear-wheel drive car [1], or a swashplate in a helicopter [2].

[1]: https://en.wikipedia.org/wiki/Differential_(mechanical_devic...

[2]: https://en.wikipedia.org/wiki/Swashplate_(aeronautics)


Those aren't examples of computation; they're examples of power transmission. If you found a way to compute the same information as a swashplate or a differential with a lower-cost, higher-speed, more reliable, lower-power device, it wouldn't replace the swashplate or differential. In fact we've had such devices for over a century, because the swashplate is just multiplying two quadrature sine waves by constants and summing them, and the differential is just adding (or subtracting).


These are bona fide examples of computation, with results immediately consumed. Computation without output is sort of pointless.

They are not very different from a computation inside an injection controller of an ICE, with its results consumed within microseconds as motions of injection valves. The key difference is the intermediate use of an electronic computer, an MCU, instead of a purely mechanical and pretty inflexible device, the camshaft.

Certainly we could replace a swashplate with some electric or hydraulic actuators driven by an MCU if we needed to compute something more complex than what a swashplate currently computes, much as we did with the camshaft. This is not very probable though, because a new system should also work unpowered to allow auto-rotation, to say nothing of higher reliability requirements than a system for a car.


My point is that, in that scenario, what replaces the swashplate is mostly the electric or hydraulic actuators, not the MCU. If it wasn't, you'd make the swashplate mechanism much smaller, lighter, and cheaper, even if you had reliability requirements your MCU couldn't meet.

In the north-pointing chariot or the Antikythera mechanism, the differential performed a computational function, with its action of transmitting power quite peripheral to that; in your car's rear end, it performs a power-transmission function, with its action of computation quite peripheral to that.

The same situation holds with transistors. You can use a 2N7000 to toggle a light or control a relay or a motor, or you can use it for (digital or analog) computation.

If you're using it in an NMOS NOT gate or the input stage of an op-amp, you're using it for computation, and so you wish it were smaller; it would work better if it were smaller because then it wouldn't need so much energy to turn it on or off. (For analog computation, you only wish it were smaller up to a point, because at extremely small sizes that makes it more sensitive to noise, but you wish it were really a lot smaller than a 2N7000.) A 2N5457 is generally better for an amplifier input stage, and the no-longer-available discrete signal MOSFETs are probably better for NMOS NOT gates. The N-MOSFETs integrated into a chip are enormously better at computation than a 2N7000.

By the same token, though, a 2N5457 or signal MOSFET is much worse than a 2N7000 at power transmission. If you're using it to PWM a motor, you wish it were larger; it would work better if it were larger because then it would be at less risk of overheating, be more efficient at a given current level, and be able to control a bigger motor. An IRF630 is a better power MOSFET than a 2N7000; an IRF540N is better still. But they're enormously worse at computation than a 2N7000.

Helicopter swashplates and differentials are very much on the power-transmission end of the spectrum, not the computation end, even though they cannot avoid doing computation as part of their job.


> You've got me intrigued -- what are examples of mechanical devices being faster than digital ones? Assuming you're talking man-made.

You might be able to build a fluid device to test a property faster than you can simulate the fluid dynamics in full detail. Perhaps not on the first iteration, but iterating small changes to get a desired result could certainly be faster than simulating it, for simple systems.


> The Water Integrator was an early analog computer built in the Soviet Union in 1936 by Vladimir Sergeevich Lukyanov. It functioned by careful manipulation of water through a room full of interconnected pipes and pumps. The water level in various chambers (with precision to fractions of a millimeter) represented stored numbers, and the rate of flow between them represented mathematical operations. This machine was capable of solving inhomogeneous differential equations.

https://www.techspot.com/trivia/97-1930s-which-countries-bui...

https://en.wikipedia.org/wiki/Water_integrator


Here’s an analogue computer called the MONIAC from ~1949 that calculates monetary flow in an economy: https://www.engineeringnz.org/programmes/heritage/heritage-r... with a fairly naff video of it operating but captures the essence: https://m.youtube.com/watch?v=rAZavOcEnLg

I like the COMPAQ branding added to it!


It all comes down to the definition of "faster". Standard testing is based on binary computations, the idea that there is a finite answer. Take a fire control computer on a ship. It has maybe 30 inputs, all essentially analogue dials. It combines them into a continuous analogue answer, a firing solution for the guns (elevation + azimuth). It doesn't do that "X times per second" or to a particular level of accuracy. The answer is always just there, constantly changing and available to whoever needs it whenever they ask for it, measurable to whatever level of precision you want to measure. If you measure the output every microsecond, then it is a computer that can generate an answer every microsecond. But that speaks more to the method of measurement than the speed of the machine.


It's true that we measure the speed and precision of analog "computers" differently from how we measure them for digital computers, but it does not therefore follow that analog "computers" are all infinitely fast and perfectly precise. Any analog system has a finite bandwidth; signals above some cutoff frequency are strongly attenuated and before long are indistinguishable from noise. And analog systems also introduce error, which digital computation often does not. When digital computation does introduce error, you can decrease the size of the error exponentially just by computing with more digits, and there is no equivalent approach in the analog world.

For mechanical naval fire control computers the cutoff frequency is on the order of 100 Hz and the error is on the order of 1%. You won't learn anything interesting by sampling them every microsecond that you wouldn't learn by sampling them every millisecond.


Basically anything that has to do with processing an analog signal. It's always faster to do that with analog electronics rather than using an ADC, doing the computation in the digital domain, and then getting the result back to the analog world with a DAC.

One example: if I need something that, when two switches are triggered, will turn on a light bulb (basically an AND gate), it's obviously faster doing that with an analog (mechanical) device - the two switches wired in series - than acquiring the signal with a microcontroller and outputting a signal to turn on the light bulb.

Thinking about the industrial world, there are cases where you have constraints about speed and real time that make it sensible to do signal processing with analog components rather than digital ones. And that was always the case before computers were invented, by the way (missile guidance systems were purely analog, as one example - you can do a lot of stuff!)


> Spanish Catalan architect Antoni Gaudí disliked drawings and preferred to explore some of his designs — such as the unfinished Church of Colònia Güell and the Sagrada Família — using scale models made of chains or weighted strings. It was long known that an optimal arch follows an inverted catenary curve, i.e., an upside-down hanging chain. Gaudí's upside-down physical models took him years to build but gave him more flexibility to explore organic designs, since every adjustment would immediately trigger the "physical recomputation" of optimal arches. He would turn the model upright by the way of a mirror placed underneath or by taking photographs.

http://dataphys.org/list/gaudis-hanging-chain-models/


The simple sundial calculates the time based on the Sun's position relative to Earth.


Cracking encryption is basically irrelevant as a quantum computing application. Post-quantum encryption algorithm development proceeds apace and the messaging is already "if you want this to still be encrypted 30 years from now, start using post-quantum encryption today." Anybody caught with their pants down the day quantum computers can actually crack 4096-bit RSA simply isn't serious about security.


“Anyone who doesn’t have weaponized anthrax isn’t serious about home defense.”

This forum gets more and more detached from reality every day.


It's definitely not true today: for example, there are no NIST standards (and I'm not sure about standards from other governments) for quantum-resistant key exchange. Several such systems have been developed, and NIST has even chosen one to standardize, but they aren't standardized or widely deployed yet.

But I expect that in 5-10 years, most security systems designed by competent professionals (up-to-date OS security services, TLS servers, SSH servers, VPN, firmware update systems etc) will have post-quantum crypto enabled by default. And I expect it will take longer than that to build a QC that can break classical crypto.

More likely it will play out like the SHA-1 break: all professional security engineers should have switched off SHA-1 (at least for unkeyed hashing) years before any collision was found, and users who apply security patches should therefore be mostly up to date, but I'm sure some are still using the older crypto.


Not this forum, but rather the US government:

“NSA intends that all NSS will be quantum-resistant by 2035, in accordance with the goal espoused in NSM-10.”

Source: https://media.defense.gov/2022/Sep/07/2003071836/-1/-1/0/CSI...


The list of things that have to be encrypted 30 years from now is very, very small. I doubt any (many?) people here have contact with any of it. I don't understand your analogy at all, sorry.


The message I sent to my girlfriend last night. In 30 years, when I am running for president, that email/text/signal message might come back to haunt me should anyone be able to decrypt the archived/encrypted copies held by state agencies.

Anything that is private today is private for a reason. That reason doesn't automatically disappear over time.


This is indistinguishable from hoarder logic. Such things straight up don't matter on the scale of decades. The US DOJ has a policy of automatic declassification after 25 years.


You do realize that because of #metoo, claims and evidence of people's actions 30, 40 years ago are being judged in the court of public opinion, if not in actual courts?

I don't think you've been paying attention to the news.

Also the US government isn't a great example. JFK was assassinated in 1963 and all records surrounding that still haven't been released.

The idea that people don't care about secrets across the span of decades is utterly wrong.


How does encryption impact #metoo? The person making the accusation would have a decrypted version of the message, and even if they didn't, they could accuse without proof.


>> hoarder logic.

And the US intelligence community is the greatest data hoarder on the planet, rivaled only perhaps by the combined forces of facebook/google.


What about my crypto-currency? Imagine a quantum computer could crash all them crypto-markets and bring about an economic collapse.


Yeah, people who are serious will have probably switched to / hybridized with PQC before a cryptographically relevant quantum computer is built. Unless some state agency has a secret one. So the main relevance might just be forcing everyone to switch / hybridize.

At the same time, from history it seems almost certain that, if indeed a CRQC ever gets built, a significant number of users will not have secure PQC rolled out on day 0.


As someone who has spent a fair bit of time figuring out how to explain quantum computing to people I've come to the conclusion that there are only two possible ways for people to understand it:

(1) mathematically

(2) as a finite list of things quantum computers can and cannot do

and most people are not going to understand it mathematically, least of all money people who have to watch and evaluate 10 powerpoint pitches in a day or whatever. Without the math you cannot possibly explain how superposition and entanglement work, and even that explanation requires your audience already understand how classical computers work. So you are often reduced to saying "Here is what we think quantum computers will be able to do. The timeline for accomplishing this is at least a decade out. Here are some other things that people have said quantum computers can do which they definitely will not be able to do." But then you're purely relying on your audience believing you based on your credentials rather than following their own reasoning from a place of understanding. Someone else can come in with different credentials and say different things, motivated by money or simple ignorance mixed with hope, and now your audience is playing the credential evaluation game rather than the quantum computing capability evaluation game. Mix in low interest rates and a few people who have learned what to say to get attention, and you get the current state of things.

There is a quiet core of real quantum computing research happening, surrounded by a moat of noise and hype that is required to interface with investors and the public. My sincere hope is that this quiet core accomplishes real advances before the music stops.


The wrong-but-useful explanation I like for how QC works is to describe it as 2^n computers doing the same computation on related data, with the caveat that you can only query one of them at random.

Sprinkle some hand-waving around how you can "average them all together" or have them "check the answer with each other" before querying them.


I was just thinking about this analogy to parallel computation. It works well enough, and gives a better intuition than a list of things a QC can and cannot do, as long as people understand that these aren't regular computers, so reading out the result has restrictions, which is why it only provides exponential speedup on some problems.


In my opinion, researchers are not honest when they do explain quantum computers mathematically. I have never seen an explanation with a narrative:

- Here are matrices, this is how we multiply them and get a resulting vector.

- Here are special kinds of matrices SU(N) which leave |\Psi|^2 invariant, and here is an example for multiplying.

- Quantum computers are just SU(N) matrix multiplication accelerators.

At that point no hype can survive, and neither can the funding.
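To make that concrete, here is the whole "accelerator" view in a few lines of NumPy, using the standard Hadamard and CNOT gate matrices on two qubits; this is a toy sketch, not anyone's product:

    import numpy as np

    # A two-qubit state lives in C^(2^2) = C^4; start in |00>.
    psi = np.array([1, 0, 0, 0], dtype=complex)

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard (unitary)
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)

    # "Running the circuit" is just multiplying the state vector by unitaries.
    psi = np.kron(H, I) @ psi        # Hadamard on qubit 0
    psi = CNOT @ psi                 # entangle: Bell state (|00> + |11>)/sqrt(2)

    print(np.round(psi, 3))                          # [0.707 0 0 0.707]
    print(np.isclose(np.linalg.norm(psi), 1.0))      # unitaries preserve |psi|^2

It also shows where the cost of simulating this classically comes from: the state vector already has 2^M entries, which is the point the reply below makes.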


I don't think that's quite true. The product state has exponential size so you'd also need some kind of tensor oracle that could manipulate an exponential quantity of information in linear time.


It's true that the product state is exponential in size. That's why the size of the square SU(N) matrix is N = 2^M, where M is the number of qubits. To my knowledge that's the only thing that is exponential there.


Hmm, that seems correct then. This is interesting, would you consider extending the explanation and posting it as an answer to this question? https://quantumcomputing.stackexchange.com/q/5459/4153


The question of the precision of number representation in SU(N) is an interesting one. It's only a different question :)


Two things can be true at the same time:

We know that an ideal quantum computer provides an exponential speed-up in commercially relevant applications.

There is a significant amount of smoke and mirrors in the quantum space.

Unless there is some unknown fundamental reason why we cannot realize a good-enough quantum computer with sufficient knowledge and engineering, governments and business are well-advised to invest into R&D of quantum computers.


By this logic, the ancient Egyptian pharaohs should have dedicated a sizeable portion of their treasury to researching manned flight, since they knew it was possible and they knew it would have significant economic and military advantages.

Which is to say, just because something is possible in theory doesn't mean we have a clear idea how to get there, and if we're waiting on research breakthroughs, more money is unlikely to significantly speed up the process.


The US is spending around 1% of its national budget on science, and a minuscule part of that goes to quantum computing. That fraction of the pharaoh's treasury might have paid for one rather entertaining crackpot to build bird-like things, but it wouldn't put a dent in their other accomplishments.


But actually if you spend more money than can reasonably and effectively be spent on a given field, you run the risk of setting back the field by advancing charlatans.


That's an interesting comment. Have you seen this happen?


VC investment in crypto. The 1% (too generous?) of proposed use cases where crypto can actually solve practical problems got hardly any VC funding compared to the idiotic ideas like smart contracts.


Maybe cold fusion?[1] (probably not though)

[1] https://en.wikipedia.org/wiki/Cold_fusion


Psychedelics.


Yes.


If you're talking about Langley, the money was tragically wasted, as the local bicycle mechanics achieved the result mostly by ignoring the theories that turned out to be wrong, and are still wrong, and have been proven to be wrong, and yet are still universally taught today, perhaps as a sort of rite of passage, because the truth is either too complex or too simple to fit neatly into the understandability exclusion zone.


Sure, why shouldn't they have?


> We know an ideal quantum computer provides an exponential speed-up in commercially relevant applications

Do we know that though?


We don't. Here's a survey paper from quantum computing expert Scott Aaronson, posted to arXiv just a couple months ago: https://arxiv.org/pdf/2209.06930.pdf

From the abstract:

> I survey, for a general scientific audience, three decades of research into which sorts of problems admit exponential speedups via quantum computers -- from the classics (like the algorithms of Simon and Shor), to the breakthrough of Yamakawa and Zhandry from April 2022. [...] I make some skeptical remarks about widely-repeated claims of exponential quantum speedups for practical machine learning and optimization problems. Through many examples, I try to convey the "law of conservation of weirdness," according to which every problem admitting an exponential quantum speedup must have some unusual property to allow the amplitude to be concentrated on the unknown right answer(s).

I am very confused at how so many people are absolutely convinced that quantum computing is known to do, or already does, all sorts of things it is not known to do. There's a sister comment here saying exponentially superior quantum computers are available in AWS!

It's like an urban legend that circulates among software engineers.


You're wrong, we do know that, and the paper you cite doesn't refute that in any way.

Integer factorization and quantum simulation would be two examples of commercially highly relevant applications where an exponential speed-up applies.


Why would integer factorization be commercially relevant? Only practical use of that (as far as I'm aware) is breaking RSA. Hopefully people will just switch to known quantum secure algorithms. It'll be a nothing burger.


Integer factorization for codebreaking is a military or criminal application, not a commercial one. Someone will make money doing it, but it won't be contributing to society in the way we normally mean when talking about commerce. It would be like saying nukes have commercial applications.

"Quantum simulation" is too vague to speak to applicability in any domain. There is no proven or empirically demonstrated exponential speedup on any simulation problem with known commercial applications.


This paper claims a quantum computer can integrate arbitrary non-linear differential equations with quantum advantage:

https://arxiv.org/pdf/2011.06571.pdf

It does seem too good to be true.


I found a layman's explanation at Quanta Magazine: https://www.quantamagazine.org/new-quantum-algorithms-finall...

The two caveats are

1. Only works for "mildly" nonlinear equations

2. Results are in quantum world and have to be translated back into normal deterministic world results, and this hasn't been figured out yet
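For what it's worth, the classical flavor of "disguising nonlinear as linear" can be seen in miniature with Carleman-style linearization: treat x, x^2, x^3, ... as separate variables, write their (exactly linear) coupled dynamics, and truncate. A toy sketch for dx/dt = -x^2, with made-up truncation order and step size; I'm not claiming this is the paper's actual construction, just the general idea:

    import numpy as np

    # dx/dt = -x^2. With y_k = x^k we get dy_k/dt = -k * y_{k+1}, which is linear.
    # Truncating at order N gives a finite linear ODE dy/dt = A y.
    N = 8
    A = np.zeros((N, N))
    for k in range(1, N):
        A[k - 1, k] = -k                  # dy_k/dt = -k * y_{k+1}

    x0, T, steps = 0.5, 2.0, 2000
    dt = T / steps
    y = np.array([x0 ** (k + 1) for k in range(N)])   # (x, x^2, ..., x^N) at t=0

    for _ in range(steps):                # forward Euler on the linear system
        y = y + dt * (A @ y)

    exact = x0 / (1 + x0 * T)             # closed-form solution of dx/dt = -x^2
    print(f"truncated linear system: {y[0]:.4f}   exact: {exact:.4f}")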


Switching a few key libraries like OpenSSL to using quantum-resistant cryptography is WAY WAY easier than constructing a viable quantum computer (in secret) that is able to break 512 bit or 1024 bit key RSA.


Integer factorization is relevant only because integer factorization is currently considered to be computationally difficult and effectively impossible. If this ceases to be the case (effective QC hardware, or perhaps some math breakthrough), the effect is simply that integer factorization stops being a valid security or proof-of-work measure and thus it ceases to be used, not that breaking integer factorization becomes a viable commercial industry.


The only thing I can think of is somehow speeding up drug discovery or materials research by accelerating atomistic simulation. However that's so far from reality it's hard to tell whether it will actually be faster in practice. (Pretty much the first application people like Feynman had in mind too)


Well, those two small niches plus biological research are bottlenecks in pretty much any production process we have involving real things.


Breaking RSA and ECC is commercially relevant.


Not in the long term, because as soon as quantum computers can practically break RSA, it would cease to be used and thus breaking RSA would become commercially irrelevant.

Currently the choice of replacements is taking its time, e.g. https://www.nist.gov/news-events/news/2022/07/nist-announces... but if we found out tonight that people do have sufficiently powerful quantum computers, we'd just start using one of the candidates in a jiffy.


If you can break RSA once using, say, a $34 billion machine, and that is a one-shot operation, is it really commercially relevant?


With the number of qubits required to break RSA, there's no reason why general purpose computing wouldn't be possible.


Without assuming a spherical cow, that is well into the realm of science-fiction stuff we have no idea how to do. It assumes a degree of control over the variables that is far beyond anything we have access to.

As a parallel, on a purely theoretical level, we know how to construct a warp drive as well[1].

Solving a theoretical equation is one thing, replicating the prerequisite conditions experimentally is another entirely. Theoretically you can balance a perfect sphere upon another perfect sphere. In practice, you can't because setting up such an arrangement is practically impossible.

[1] https://en.wikipedia.org/wiki/Alcubierre_drive


> Breaking RSA and ECC is commercially relevant.

Yes, but we don't actually know that we can build such a device.


Yes, and you can test it yourself on quantum hardware that is already available to consumers via AWS.


AWS Braket is only, at best, useful for research in quantum algorithms. It gives you access to classical simulations of QCs (which run your linear-time quantum algorithm in exponential real time), or access to some noisy real QCs (the ones whose best achievements so far are proving that 21 = 3 * 7 using a version of Shor's algorithm specialized for certain numerical properties of the number 21).


The amount of research funding is finite. Why spend it on something highly speculative when there are many other things it could be spent on which are not speculative?


It's finite but not explicitly conserved: science funding is around 1% of the US national budget and can change (a bit).

In practice you're right, though, usually new funding in one place gets taken from somewhere else. But if the world really wanted scientific progress we could afford to fund some more speculative projects.


> Two things can be true at the same time

Yes, but it is not possible to assert that both are true at the same time with full certainty.


"Problem is, a lot of CEOs in industry and the financial sector can’t tell a bra from a ket"

I do not know enough about quantum computing to judge whether this article is accurate or not, but it certainly is well written and entertaining


I highly recommend "Quantum computing for the very curious"[0] for an introduction to quantum mechanics. I went through it years ago and can still remember the main ideas thanks to the built-in spaced repetition.

[0] https://quantum.country


It is remarkable how everybody only seems to comment on "spaced repetition" rather than anything involving quantum computing when they discuss that site.

Regardless, memorizing a few facts won't help with reasoning about "is quantum computing even possible".


Maybe because it's pretty much the only essay that offers built-in spaced repetition?


I'm familiar with bras I think... what is a "ket"?



It seems I was NOT familiar with what a bra is...


It's the Hermitian conjugate of the corresponding ket.


And a monad is an endofunctor of a monoid


With that at least I can be nearly certain we are not discussing an alternative brassiere.


It's physicist notation for inner products: https://en.wikipedia.org/wiki/Bra%E2%80%93ket_notation
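In code it's pretty mundane; a minimal NumPy rendering, just to demystify the notation:

    import numpy as np

    ket_psi = np.array([1, 1j]) / np.sqrt(2)   # |psi>, a column vector
    ket_phi = np.array([1, 0], dtype=complex)  # |phi>

    # <phi|psi>: the "bra" is the conjugate transpose, so the inner product
    # is just np.vdot (which conjugates its first argument).
    print(np.vdot(ket_phi, ket_psi))           # (0.7071...+0j)

    # |psi><psi|: an outer product, e.g. the projector onto |psi>.
    print(np.outer(ket_psi, ket_psi.conj()))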


I agree that many in the financial sector do not know much about quantum computing, but it also seems that some in the quantum field do not understand the details of the problems the financial sector has. For instance, those on the capital markets/trading floors in Europe and North America run significant risk calculations, with hundreds or thousands of variables over up to 100,000 positions, to a certain confidence level. Many of these risk calculations need to be reported nightly to regulators. These financial companies pay many millions per year to AWS, Azure, or Google to do these calculations on classical computers. However, many parts of these calculations could be done with quantum computers relatively instantly, given enough qubits. I realize that the technology and reliability are not there today, but hopefully they will come soon. I would not be surprised if by 2035 using quantum computers for many-variable risk calculations became (almost) mandatory for major European and North American financial companies.
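For anyone wondering what those nightly runs look like: they are mostly large Monte Carlo jobs. A heavily simplified sketch with made-up portfolio sizes and numbers (real runs reprice nonlinear instruments position by position, which is where the cost explodes):

    import numpy as np

    rng = np.random.default_rng(0)
    n_positions, n_factors, n_scenarios = 1000, 50, 100_000   # toy sizes

    # Made-up linear exposures of each position to each risk factor.
    exposures = rng.normal(size=(n_positions, n_factors))
    factor_vol = 0.01 * rng.uniform(0.5, 2.0, size=n_factors)

    # Simulate factor moves, reprice the book, read off the 99% value-at-risk.
    shocks = rng.normal(size=(n_scenarios, n_factors)) * factor_vol
    book_exposure = exposures.sum(axis=0)     # net exposure per factor (linear book)
    pnl = shocks @ book_exposure              # portfolio P&L in each scenario
    var_99 = -np.quantile(pnl, 0.01)
    print(f"99% one-day VaR (toy units): {var_99:.3f}")

As far as I know, the usual quantum pitch for this workload is amplitude-estimation-based Monte Carlo, which is a quadratic rather than exponential speedup.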


What makes you say this? Which particular quantum algorithm do you think gives an exponential speedup for multi-variate risk predictions? And why do you think there is any chance in hell that a quantum computer would exist by 2035 that could even store an input of the size you're talking about, nevermind have spare qubits to actually process it?


On a semi-related note, bra-kets are just magical. Programmers who deride mathematical notations, and claim we should abolish equations in favor of pseudocode, should try to get a glimpse of their power.


The "next big thing" isn't coming along well.

* 2017 - the year of 3D TV.

* 2019 - the year of VR

* 2021 - the year of the Metaverse.

All duds, or no more than niche products. Related duds include quantum computing, fusion power, and self-driving cars. (There are self-driving cars that work, from Waymo and Cruise, but they're a long way from being cost-effective.)

On the other hand, there's lots of work to be done deploying the stuff that works. Solar. Wind. Batteries. Electric cars. Desalination plants. Automated manufacturing. Electrical transmission infrastructure to get power to where it's needed. All are profitable. None has either huge margins or monopolization potential. This discourages the Silicon Valley funding model.


>> On the other hand, there's lots of work to be done deploying the stuff that works. Solar. Wind. Batteries. Electric cars. Desalination plants. Automated manufacturing. Electrical transmission infrastructure to get power to where it's needed. All are profitable. None has either huge margins or monopolization potential. This discourages the Silicon Valley funding model.

Honestly, this should be in all caps, and heard by everyone. I am regularly disturbed by this fact.

I am a lead/senior software engineer working in frontend development. I could easily do embedded software engineering, automation, or any other field more pressing for humanity. But the economics simply aren't there to justify it. I would be taking a 60% pay cut if I chose to work on anything that actually matters.


Yeah it's insane how shit gets funded and obvious no-brainers do not. However, working on something that matters, despite the pay cut, brings its own rewards. But it is not easy to find such work.


Let's add to the list crypto currencies and all the related stuff, such as NFTs, that have already made a lot of people lose a lot of money.


Sadly, crypto currencies aren't dead yet like 3D television, and they're still causing massive CO2 emissions.


>* 2019 - the year of VR

How is 2019 the year of VR? Look at Google trends for "Oculus Quest". You can see that it continues to grow. 2019 was not some kind of peak.


Did we even have a quantum summer? The hype is nowhere near what we've seen for (say) blockchain, AI or self-driving cars.


The hype is smaller, though I’ve seen my share of grifters in the market. It’s also not a vote of confidence to see IBM behind much of the research.


The problem with quantum computers is that every time someone thinks of a good application, there's a scientist dropping in with an "ackchyually that's not how any of this works". At this point I'm convinced that RNG is the only thing quantum effects are actually usable for.


Already in winter mentality :)


That hype was back in the 00's. This one could be like AI, where it had a huge hype cycle in the 80s, and exploded over the last decade. Or not.


I think that is a good thing. There are usecases being found for quantum computers, but nothing that can’t be done with normal computers yet.


There was as far as marketing. My family members stopped asking me about ads they saw on TV for IBM's quantum computers.


I've started to adopt a hedging strategy (i.e. short plays) with IBM marketing...seems like everything they touch dies (horrendously).

ETA: in case anyone is interested, the author of the article is a theoretical physicist, which lends more credibility to the article.

https://en.wikipedia.org/wiki/Sabine_Hossenfelder


Oh yes, we did. Remember Google announcing quantum supremacy? At the very least a spring.


Unfortunately I work in cryptography, and breaking our algorithms is one of the tasks that requires the least number of qubits. And that doesn't depend on having a desktop quantum computer either, but only that a single state actor anywhere has a sufficiently large quantum computer.

I'm a bit bitter that physics handed us this magical tool, and the best we managed so far is using it to invalidate decades of security research.


> Unfortunately I work in cryptography, and breaking our algorithms is one of the tasks that requires the least number of qubits

I don't think that is true (or maybe I'm underestimating how many qubits other uses of QCs take). Estimates are still in the many millions: https://cacm.acm.org/news/237303-how-quantum-computer-could-...


You only need a few thousand error-free qubits to implement Shor's algorithm for 256-bit Elliptic Curve Discrete Log, which would, for instance, break nearly all crypto in use. The "millions" is trying to account for the several orders of magnitude of error-correcting overhead.
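Rough numbers, all assumptions rather than a citation: with a surface code each logical qubit costs roughly 2*d^2 physical qubits at code distance d, so "a few thousand logical" and "millions physical" are the same statement.

    # Back-of-the-envelope only; the real overhead depends on hardware error
    # rates, the algorithm, routing, magic-state factories, etc.
    logical_qubits = 3000     # ballpark for a Shor-style attack (assumption)
    code_distance = 27        # plausible surface-code distance (assumption)

    physical_per_logical = 2 * code_distance ** 2    # ~d^2 data + ~d^2 ancilla
    total_physical = logical_qubits * physical_per_logical
    print(f"{physical_per_logical} physical per logical -> ~{total_physical:,} physical qubits")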


Sure, I just don't think error-free qubits are a thing (or will be in the future). I don't think anyone seriously expects quantum computing to work without error correction.


The difficulty of adding qubits increases super-linearly with the number of qubits (especially because of communication delay vs. time to decoherence), so "only" a few thousand is already very optimistic. Worse, the idea of "error-free qubits" is essentially like cold fusion - you can say the words and we understand what you mean by them, but they don't describe anything that can exist in practice.


> The difficulty of adding qubits increases super-linearly with the number of qubits

Is that true? Hardware from the likes of IBM and IonQ has already gone from < 10 to >= 20 “algorithmic qubits” [1] in the space of a few years.

[1] https://ionq.com/quantum-systems/aria


Error-free qubits are a fantasy, error correction is a must. I'm not particularly worried about quantum computers breaking crypto anytime soon.


That's from three years ago, and for error-corrected RSA breaking. ECC has keys an order of magnitude smaller, and minimizing the number of qubits needed to run Shor's is a hot area.

And compared to other uses (quantum AI anyone?), it's surprisingly compact.


It's rather easy to produce qubits and place them in a box. The hard part is to make them interact with each other in a controlled fashion; thus, adding one qubit to a large system is substantially harder than adding it to a small one.

The only true benchmark is factorisation of numbers. The number 21 has been factorised with nudging. Let's wait for 45 in the coming decade.


How many qubits vs larger keys?


I'm afraid the number of qubits doesn't grow fast enough. Here's[1] a tongue-in-cheek "Post-Quantum RSA" with 2 Terabit keys.

[1] https://cr.yp.to/papers/pqrsa-20170419.pdf


> Unfortunately, I have to inform you that Moore’s law isn’t a law of nature. It worked for conventional computers because those could be miniaturized. However, you can’t miniaturize ions or the Compton wavelength of electrons. They’re already as small as it gets.

I think this is a mischaracterization. Moore's law is the result of many individual multiplicative advances that stack on top of each other (many of them allowing further miniaturization, but some going in the direction of larger chips instead: chiplets, 450mm wafers, wafer-scale chips).

There isn't a reason this stacking couldn't also be possible for Quantum Computers. Indeed, the number of qubits seems to grow exponentially [1]: 1 -> 2 -> 5 -> 17 -> 49 -> 76 -> 127 -> 216 -> thousands. If anything, the iterative miniaturization of classic circuits made the first steps more approachable compared to quantum computers that had to start on the atomic level.

However, the existence of more and more stackable advances is indeed not a law of nature, as some authors seem to assume [2].

[1] https://en.wikipedia.org/wiki/List_of_quantum_processors [2] https://en.wikipedia.org/wiki/Accelerating_change#Kurzweil.2...


Quantum Computers are a great research field. The physics and engineering problems are exciting.

It's a case for research money. I wouldn't invest in it but billionaires could spend their money worse.


Yeah, they could buy Twitter :-)


One thing to note is that while there are not a lot of quantum algorithms (though wikipedia's list is very incomplete!), some of them are massively useful. Grover's algorithm provides a quadratic improvement in black-box search (which is a subroutine in many, many algorithms). A system of sparse linear equations can be solved in log(n) time on a quantum computer, compared to n on a classical computer. This is the single most important problem in scientific computing.
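Grover is also small enough to brute-force simulate end to end. A toy NumPy version for 32 items with one marked entry (index picked arbitrarily), just to show the ~sqrt(N) iteration count concretely:

    import numpy as np

    n = 5                          # qubits -> N = 32 "database" entries
    N = 2 ** n
    marked = 13                    # the item the oracle recognizes (arbitrary)

    psi = np.full(N, 1 / np.sqrt(N))          # uniform superposition (Hadamards)

    oracle = np.eye(N)
    oracle[marked, marked] = -1               # flip the phase of the marked item
    diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)   # inversion about the mean

    iters = int(round(np.pi / 4 * np.sqrt(N)))           # ~O(sqrt(N)) iterations
    for _ in range(iters):
        psi = diffusion @ (oracle @ psi)

    probs = np.abs(psi) ** 2
    print(iters, "iterations;", f"P(measure {marked}) = {probs[marked]:.3f}")

With N = 32 that is 4 iterations for a ~0.999 success probability, versus roughly 16 expected classical queries.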


> Solving a system of sparse linear equations ... is the single most important problem in scientific computing.

I think this is arguable. To be sure, a lot of linear-system solving goes on in science, but you cannot conclude from this that it is the "single most important problem", it's just the one that we happen to know how to solve, and so that's where the compute power goes. It's like looking for your keys under the street light because it's easier to see rather than because it's where you lost your keys. Protein folding and the Navier-Stokes equation are arguably "more important", we just have no idea how to solve those problems.


The Navier-Stokes equation is quite literally solved using the diagonalization of large matrices!


As an approximation, because we don't know how to find exact solutions.



> You’re better off reading ancient Greek than studying a ‘technical’ subject that eventually involves bringing a public school kid like me a steak.

Love it!


> director of research at Bank of America said that quantum computing will be “bigger than fire”. The only way in I can see this coming true is that it’ll produce more carbon emissions.

I laughed so hard at this. But it's true.


Broad strokes, this article raises some OK points. She's wrong about SC qubit coherence times and she's wrong about refrigeration being any kind of bottleneck. You could buy a dilution fridge from Bluefors off the shelf. The microwave hardware and optics and all this stuff cost an order of magnitude more. Just kind of a low-effort article. If you want to beat on SC qubits, bring up cosmic rays. She'd know that if she passed this by an expert, which I guess she's too smart to do.

Either quantum computers pan out over the time investors stay interested or they don’t. It’s not unlike any other technology. As far as physicists feeling embarrassment, why should we? One, we’re largely incapable of feeling it, and two, we tried and that’s cool.

You have these VCs who have a very narrow view of how the future should be (chiefly being that which makes them richer) so you can only pitch so many technologies that fit that mold. To be blunt, find me a better technology that could potentially push the boundaries of human capability or understanding or whatever that VCs will invest in. Beats the hell out of VR. Is it cooler than shooting stuff into orbit? Seems on par to me.


Cooling is definitely a problem if your aim is to cool down the millions of physical qubits required to get the thousands of logical qubits needed to do an actually useful quantum computation. No one has scaled up their qubit technology to achieve millions of physical qubits, and this is going to be a very big computer with tons of thermal mass that needs to be cooled down sub-kelvin. That is a substantial engineering challenge.


Qubits are getting smaller at the same time their coherence times are getting longer. The millions of qubits figure assumes pretty crappy coherence times. I don’t see any reason why you wouldn’t fit a future QPU on a 4 inch wafer or some kind of stack of them.

Finally, again, dil fridges are a solved problem. Not only are they solved but there’s a ton of room for improvement. They’re very inefficient. You can just make bigger ones with more dil units.

That leaves the wiring, but you remove a lot of that with cryosilicon computers and multiplexers.


I would also say that Sabine is wrong on a number of points, like coherence times, but I thought that the problems with cooling were correct. Since I don't know much about cooling, would you be happy to elaborate?


See my other response


The video mostly seems reasonable, but the economic arguments toward the end seem off base to me.

Hossenfelder describes a situation in which universities rent equipment from large companies, but fits this into a worldview in which quantum computing as an industry will not be commercially viable.

But isn't this exactly what happened with the internet and other new forms of large scale computation? Initially demand came largely from academia (or government via defense), companies competed on cost and usability. After a few years or decades of competition and scaling, the technology became so commercially useful that it's now ubiquitous. Why won't that happen with quantum computers, why would that be a bad thing, and why shouldn't academics want to be working on that?


Classical computers were solving all kinds of useful problems since the beginning.

Quantum computers, on the other hand, don't appear to be good for anything now or in the foreseeable future.


> They just produced a random distribution that would take a really long time to calculate by any other means. It’s like this this guy stapling 5 M&M. That’s a world record, hurray, but what are you going to do with it?

And this is why I keep coming back to Sabine Hossenfelder.


Not the first article on HN to make that point [0]. Feels a bit like whistleblowers starting to come out.

[0] https://news.ycombinator.com/item?id=32722374


The main application of quantum computing is Cunningham’s Law. Any time someone shows something is faster on a quantum computer, then you know it’s worth spending some time proving that actually the classical algorithms could have been faster in the first place.


Can somebody name one practical application of QC? Perhaps not right now, but within 3-5 years? It can’t be prime number factorization in encryption, since people will just switch (and already have) to elliptic curves.


Seems like true neuromorphic computing has more of a future than quantum computing. We assume processing power and iterations go together, but what happens when we get better at self-learning software like our wetware?


As far as we know, solving the Schrödinger equation would be exponentially faster on a quantum computer than on the fastest possible non-quantum computer. Given that "neuromorphic" computers are classical computers, we don't know of any way to achieve a similar speedup for this problem, so we can at least say that there are applications for which we need a QC if we want to solve them. Of course, it's possible that those applications are far fewer than those for "neuromorphic" computers, so it may make more sense to invest more in the latter.

Note that I'm putting neuromorphic in quotes because it's mostly a marketing term; the resemblance between memristors and neurons is at best symbolic.


> Quantum computers are promising technology, yes, but the same can be said about nuclear fusion and look how that worked out.

I've been following Sabine for quite a while. She's really working on her snark.


> weather is a non-linear system whereas quantum mechanics is a linear theory

This lady has no idea what she is talking about.


I believe that she was using the term "linear" in the signal-processing sense of a linear system, not in the computational-complexity sense.

In electronics, in a linear system, you can decompose a complex waveform and analyze the system's response to each frequency separately, and when you recombine them, you get the correct answer.

In a non-linear system, such as a mixer, no such analysis is possible, you have to consider all of the frequencies, and their levels at the same time.

Also consider that most algorithms are founded on a deterministic computational method.
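A quick way to see the distinction (toy signals, nothing quantum about it): superposition holds for a linear filter and breaks the moment you square or mix.

    import numpy as np

    t = np.linspace(0, 1, 1000)
    a = np.sin(2 * np.pi * 5 * t)        # 5 Hz component
    b = np.sin(2 * np.pi * 50 * t)       # 50 Hz component

    def linear_filter(x):                # moving average: a linear system
        return np.convolve(x, np.ones(5) / 5, mode="same")

    def mixer(x):                        # squaring: a nonlinear element
        return x ** 2

    # Linear system: response to (a+b) equals response to a plus response to b.
    print(np.allclose(linear_filter(a + b), linear_filter(a) + linear_filter(b)))  # True
    # Nonlinear system: the 2ab cross term breaks superposition.
    print(np.allclose(mixer(a + b), mixer(a) + mixer(b)))                          # False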


I believe the point is that quantum computers allow you to manipulate a linear combination of bits, whereas simulating the weather would require non-linear combinations. It's unclear how a QC would help simulate something like that better than a classical computer.


Quantum algorithms are nonlinear. If someone is saying "quantum mechanics is a linear theory" as a counterargument to anything QC related, then that person has no idea what they are talking about, because they likely never cared to learn even the basics of QC.


I don't think it's quite as simple as that:

"New Quantum Algorithms Finally Crack Nonlinear Equations. Two teams found different ways for quantum computers to process nonlinear systems by first disguising them as linear ones."

https://www.quantamagazine.org/new-quantum-algorithms-finall...


Quantum mechanics is a linear theory. That's not an erroneous statement.

Quantum computing and quantum mechanics, not the same thing.


It constantly amazes me how much attention Hossenfelder, a simple research fellow, manages to attract.


Intelligently communicating complex topics to the masses is a valuable skill in itself. She gains a lot of credibility with her humble attitude and willingness to admit what she does not know.


Sabine isn't very accurate though and pushes her own fringe ideas while criticising everybody else.


Even if that's the case, she seems able to differentiate opinion from fact.


She opines on a lot of things she has very little knowledge about and her writing style is terrible. But it’s crack for HN readers who mistake skepticism and snark for insight.


Do you have an educational youtube channel? Will definitely take a look if you do.


PBS Space Time seems pretty good. He gives good explanations and gives a balanced perspective to both sides when there's debate.


Hossenfelder has been in Physics for more than 25 years and is a seasoned researcher and communicator. Her “academic rank” shouldn’t and in fact doesn’t have any bearing on her impact.


Can slow neutrinos, in principle, be used to build qubits?


I took Computers in high school, some might even say I’m a computer guy, and this article is absolutely correct. Believe me.


That was a great fucking read


Because there was a spring?


This reads like something like IBM® Watson.

There was indeed an AI boom at the time, just not Watson.


Is there a tl;dr? This is a very long post that doesn't seem to want to get to the point by the time you're halfway through.


I'm sick and tired of this pop science. Write research papers, don't clog my YT feed.


Guys stop saying there is winter this and winter that, it's all part of the Gartner hype cycle:

1. it all started with a technology trigger (much like the early AI development by Turing and McCarthy)

2. then we reach the Peak of Inflated Expectations (trying to solve real world hard problems like Travelling Salesman)

3. and slog through the Trough of Disillusionment (death of the LISP machine and the halt of major AI projects overall)

4. until the Slope of Enlightenment (accidental discovery of using GPU to accelerate AI computation)

5. and finally reaching the Plateau of Productivity (developing TensorFlow, PyTorch, and the overall AI democratization through the use of DL and AutoML).

We are just barely between the Peak of Inflated Expectations and the Trough of Disillusionment for Quantum Computing, and very likely to stay there for a while. Don't you worry child, it's all part of the cycle.


Guys, stop saying Gartner hype cycle this and Gartner hype cycle that. Not every failing overhyped idea is secretly a future winner. Sometimes, there's actually a 3D TV winter.


#4. will be more like: "someone uses some new advance from another field to actually make it work... then they get crushed in court by patent trolls who acquired all the failed IP."



