As someone who works in quantum computing, I can confirm that there's hype in the field; one of the professors at my university has a quantum machine learning (TM) startup that seems to be doing well, even though most of the faculty and grad students can tell you that it's bullshit.
However. This paragraph straight up displays a fundamental lack of understanding by Hossenfelder:
> Last time I looked, no one had any idea how to do a weather forecast on a quantum computer. It’s not just that no one has done it, no one knows if it’s even possible, because weather is a non-linear system whereas quantum mechanics is a linear theory.
Unitary evolution generated by the Schrödinger equation is a linear map on _probability amplitudes_, just like how classical (probabilistic) computing performs linear operations on _probability distributions_. The commonly used quantum circuit model is a superset of classical logic gates and can accomplish anything a probabilistic classical computer can, so if anything is possible in a classical computing scheme, it's also possible in the quantum circuit scheme.
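To make that concrete, here's a minimal numpy sketch (my own illustration, not anything from her article) of the textbook construction: the Toffoli gate is a perfectly legal quantum gate, and with its target qubit prepared in |1> it computes NAND of the two controls. Since NAND is universal for classical logic, any classical circuit embeds into the quantum circuit model this way.

```python
import numpy as np

# Toffoli (CCNOT) as an 8x8 permutation on 3-qubit amplitude vectors:
# it flips the target qubit iff both controls are 1, i.e. it swaps the
# basis states |110> and |111>. It is unitary, hence a valid quantum gate.
TOFFOLI = np.eye(8)
TOFFOLI[[6, 7]] = TOFFOLI[[7, 6]]

def basis(a, b, c):
    """Amplitude vector of the computational basis state |a b c>."""
    v = np.zeros(8)
    v[4 * a + 2 * b + c] = 1.0
    return v

# With the target prepared in |1>, Toffoli writes NAND(a, b) into it.
for a in (0, 1):
    for b in (0, 1):
        out = TOFFOLI @ basis(a, b, 1)
        target_bit = int(np.argmax(out)) & 1  # read the target bit back out
        print(f"NAND({a}, {b}) = {target_bit}")
```

Note that the gate itself is a linear (unitary) map on amplitudes, yet the function it computes on basis states, NAND, is non-linear; that's exactly why "quantum mechanics is linear" doesn't rule out non-linear computation.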
I don't have much sympathy for her, since this is not the first time Hossenfelder has displayed a lack of understanding: she recently published a paper criticizing another one [1], which has since been replaced with a much shorter text after she was corrected [2] by the authors of the original paper.
Yeah I get it, it's dumb when the president of BofA is talking about how QC is "the next big thing", I know it's not coming Soon^TM, but saying "we will never have a quantum computer because the current ones suck" has the same energy as "the world doesn't need more than 5 computers" imo.
> Unitary evolution generated by the Schrödinger equation is a linear map on _probability amplitudes_, just like how classical (probabilistic) computing performs linear operations on _probability distributions_. The commonly used quantum circuit model is a superset of classical logic gates and can accomplish anything a probabilistic classical computer can, so if anything is possible in a classical computing scheme, it's also possible in the quantum circuit scheme.
To predict the weather on a computer, we need to run large CFD simulations. On a classical computer, this involves discretizing a system of PDEs with millions or billions of degrees of freedom, at 4 or 8 bytes per floating-point number. It may be possible to do the same CFD simulations on a quantum computer, but this is severely constrained by the small number of qubits currently available. And clearly, even if you could run the same algorithm, the point of using a quantum computer would presumably be to reap the "quantum advantage" and do something algorithmically superior to what's possible on a classical computer.
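Just to put rough numbers on that (illustrative values I'm assuming here, not taken from any particular operational model), the state of such a discretization alone is enormous:

```python
# Back-of-envelope state size of a global weather-model discretization.
# All grid/field counts below are assumed, order-of-magnitude values.
horizontal_points = 510_000_000  # ~1 point per km^2 of Earth's surface
vertical_levels = 100            # assumed number of atmospheric layers
fields = 10                      # wind, pressure, temperature, humidity, ...
bytes_per_value = 8              # one 64-bit float

state_bytes = horizontal_points * vertical_levels * fields * bytes_per_value
print(f"state size: {state_bytes / 1e12:.1f} TB")  # ~4.1 TB, before any workspace
```

Holding terabytes of state on machines with at most a few hundred noisy qubits today is exactly the mismatch in question.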
I think this is a pretty small point to get hung up on. The rest of her article is perfectly reasonable.
It really is not, because this is a semi-common misconception, and to me it signals basic unfamiliarity with the subject. Schrödinger's equation being linear has nothing to do with implementing non-linear functions. There's no question about how you would implement said logic on a quantum computer - you can just do what the classical implementation does. Yes, we have nowhere near enough qubits and it would be a gross waste of resources, but we _do_ know how to do it. Saying no one knows how to do it is, at best, a false statement. Again, I'm not saying we're going to be solving weather models on a quantum computer soon - even though I know folks are working on QC algorithms for (non-linear) PDE solutions.
> Schrödinger's equation being linear has nothing to do with implementing non-linear functions.
Indeed. And it's possible SH is confused by this, since she had another video, about quantum chaos in asteroids, where a similar observation applied, and she didn't address it there. However...
> There's no question about how you would implement said logic on a quantum computer - you can just do what the classical implementation does. Yes, we have nowhere near enough qubits and it would be a gross waste of resources, but we _do_ know how to do it. Saying no one knows how to do it is, at best, a false statement.
Here you're being a little uncharitable. Theoretically, one could indeed make the quantum computer simulate a classical computer running the non-linear weather algorithm. But the interesting point Hossenfelder may be making here is that there is no known way to make quantum computers calculate/simulate the weather evolution in a "quantum computer way" - that is, not by simulating a discrete-state classical computer, which would be wasteful and most probably offer no advantage, but by realizing the differential-equation evolution in an analog mode, using the quantum superposition capabilities. That is not known to be possible. A quantum computer may be an analog computer (continuous evolution of state), but it is not clear how to use it to integrate interesting sets of differential equations like weather models.
This paper makes a very interesting claim: that they can integrate arbitrary non-linear differential equations on a quantum computer, with advantage. If their analysis is correct, it makes the case for weather prediction on quantum computers much stronger.
Yeah the "superset" perspective seems like you could end up cheating, implementing the equivalent of classical logic and probably much more slowly than a classical computer could do it.
I googled around, though, and found this research, which does propose using a nonlinear quantum system: https://arxiv.org/abs/2210.17460. It doesn't really claim the issue is solved.
> I think this is a pretty small point to get hung up on. The rest of her article is perfectly reasonable.
The above isn't the only place that betrays her lack of understanding, though.
For instance, she confidently writes "Ion traps are used for example by IonQ and Honeywell. They must “only” be cooled to a few Kelvin above absolute zero," but this is just wrong; trapped-ion qubits do not, a priori, require cryogenic cooling. Yes, lowering the temperature can be useful for incidental reasons, as it improves the vacuum quality and reduces some technical excess noise sources, but this is simply an engineering choice. Many of the high-profile results in trapped-ion quantum information processing were in fact achieved in room-temperature systems. And even if one does opt for cryogenic cooling, the ~tens of Kelvin regime of interest here is incomparably easier to reach than the tens of milli-Kelvin required for superconducting qubits and other solid-state spin platforms (where those elaborate dilution refrigerator "chandeliers" are actually required to keep the qubits intact). In fact, in ratiometric terms, the temperatures of interest are actually closer to room temperature than to that millikelvin regime!
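To put that last ratio point in numbers (using representative values I'm assuming: ~300 K ambient, ~4 K for a cryogenic ion trap, ~15 mK for a dilution refrigerator):

```python
# Representative operating temperatures (assumed, order-of-magnitude values).
room_temp_k = 300.0        # ambient lab temperature
ion_trap_cryo_k = 4.0      # cryogenically cooled ion-trap systems
dilution_fridge_k = 0.015  # superconducting-qubit dilution refrigerators

# Ratio-wise, 4 K is much "closer" to room temperature than to 15 mK.
print(f"room / ion-trap cryo: {room_temp_k / ion_trap_cryo_k:.0f}x")              # ~75x
print(f"ion-trap cryo / dilution fridge: {ion_trap_cryo_k / dilution_fridge_k:.0f}x")  # ~267x
```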
Like many physicists, I'd naturally be inclined to agree with Sabine Hossenfelder as far as her distaste of marketing hype is concerned, but in making authoritative-sounding statements without having the knowledge to back them up, and misrepresenting what one would hope she knows are the actual scientific facts in the service of a punchy script, she is hardly doing any better than those private-sector hype evangelists she ridicules. Beware of Gell-Mann Amnesia…
> Yeah I get it, it's dumb when the president of BofA is talking about how QC is "the next big thing", I know it's not coming Soon^TM, but saying "we will never have a quantum computer because the current ones suck" has the same energy as "the world doesn't need more than 5 computers" imo.
From the outside, QC looks less like traditional computing (as you're suggesting) and more like cold fusion: there are plenty of hopeful stories and investments, but it's hard to tell if it'll ever happen in a meaningful way.