The sceptic arguments I read focus on accidental decoherence during the computation. If that goes well, and classical computers only "catch up" during measurement, wouldn't this mean that the time cost of measurement has to scale super-polynomially? Is that plausible?
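For a rough sense of the gap such a measurement step would have to absorb, here's a back-of-the-envelope comparison I put together (my own numbers: the textbook ~n^3 gate count for Shor's algorithm, and the heuristic GNFS exponent for the best known classical factoring; constants are approximate and lower-order terms are ignored):

    import math

    def shor_gates(n):
        # Textbook circuit size for factoring an n-bit number, ~n^3 gates
        # (modular exponentiation dominates; the QFT itself is ~n^2).
        return n ** 3

    def gnfs_ops(n):
        # Heuristic cost of the general number field sieve:
        # exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3)) with N ~ 2^n.
        ln_N = n * math.log(2)
        return math.exp((64 / 9) ** (1 / 3) * ln_N ** (1 / 3) * math.log(ln_N) ** (2 / 3))

    for n in (512, 1024, 2048):
        print(n, f"{shor_gates(n):.3e}", f"{gnfs_ops(n):.3e}")

That's the size of the discrepancy: if the unitary part really runs in poly(n) time, the only place a classical "catch-up" could hide is a measurement step whose cost grows super-polynomially in n, which is what I'm asking about.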
I got tired of speaking on behalf of a vague memory of Stephen Wolfram, so I just looked up the transcript.
SW: Yeah, I think… I think it’s not going to be true [that QC gives a speedup], that’s my guess. I think what’s going to happen is, if you take Shor’s algorithm for factoring, which is primarily a quantum Fourier transform, that Fourier transform is done beautifully quickly because there are all these threads that are running in parallel. The problem is, every thread is somewhere in a different place in branchial space, that thread, that us observers, we have to corral all those things back together again in order to tell what actually happened, and that’s… So there’s a… Usually in quantum computing one just says, “and then there’s a measurement.”
1:26:49.0 SW: Now, in actuality, when you have an actual device, you have all kinds of issues in making that measurement, all kinds of… How quickly does it decohere, all these kinds of things. There are all these kinds of very practical experimental features, and I think people have generally said, given the formalism of quantum mechanics, it’s like, well, all this quantum stuff happens and then boom, we do a measurement, and the boom, we do a measurement is actually pretty difficult in practice with actual experiments, that people have said, but if we do these experiments well enough, it will become the mathematical idealization that von Neumann and others made about how measurement works, and I don’t think that’s going to be true. I mean, we’re not sure yet, but it seems likely that there will be no way to do… To sort of, if you’re honest about how the measurement works, the measurement takes effort.
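For anyone who doesn't have the QFT in their head: the transform he's talking about is just the discrete Fourier transform applied to the 2^n amplitudes of an n-qubit register. A minimal classical sketch (my own helper, not from any quantum SDK):

    import numpy as np

    def qft_matrix(n_qubits):
        # Discrete Fourier transform on the 2**n_qubits amplitudes of the
        # register; this is the unitary at the heart of Shor's algorithm.
        dim = 2 ** n_qubits
        omega = np.exp(2j * np.pi / dim)
        j, k = np.meshgrid(np.arange(dim), np.arange(dim))
        return omega ** (j * k) / np.sqrt(dim)

    U = qft_matrix(3)
    assert np.allclose(U.conj().T @ U, np.eye(2 ** 3))  # check it's unitary

On a quantum device this is implemented with O(n^2) gates rather than by building the 2^n x 2^n matrix, which is where the claimed parallelism lives.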
That reads like Wolfram is generally pretty ignorant of the current state of quantum measurement. We can do continuous measurements now that barely perturb the system (Google "quantum non-demolition" and "weak measurement"). Also, while measurements are difficult to explain in QM (incompatible with unitary Schrödinger evolution), our theory of them is quite rich and complete now. This just seems rambling and imprecise from Wolfram. We can actually do the idealised measurement, which is why the standard quantum limit is such a big deal in gravitational-wave detectors.
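To make "barely perturb" concrete, here's a toy simulation of repeated Gaussian weak measurements of sigma_z on a single qubit (a minimal model of my own, not any particular lab's scheme; weak_measure_z and the strength value are made up for illustration). Each individual readout is mostly noise and nudges the state only slightly, yet the accumulated record gradually pins the qubit down:

    import numpy as np

    rng = np.random.default_rng(1)

    def weak_measure_z(psi, strength=0.05):
        # Gaussian weak measurement of sigma_z with Kraus operators
        # M(r) ~ exp(-strength * (r - sigma_z)^2 / 4); small strength means
        # a noisy readout and correspondingly small back-action.
        p_up = abs(psi[0]) ** 2
        centre = 1.0 if rng.random() < p_up else -1.0
        r = rng.normal(centre, 1.0 / np.sqrt(strength))   # sample the readout
        M = np.diag(np.exp(-strength * (r - np.array([1.0, -1.0])) ** 2 / 4.0))
        psi = M @ psi
        return psi / np.linalg.norm(psi), r

    psi = np.array([1.0, 1.0]) / np.sqrt(2)   # start in |+>, <sigma_z> = 0
    for _ in range(400):
        psi, _ = weak_measure_z(psi)
    print("<sigma_z> after many weak kicks:", abs(psi[0]) ** 2 - abs(psi[1]) ** 2)

No single step does anything dramatic to the state; only the long run of weak kicks adds up to something close to a projective measurement.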
EDIT: I should qualify the above: in order to surpass the standard quantum limit, we will turn GW detectors into QND (quantum non-demolition) detectors using techniques such as frequency-dependent squeezing (see Kimble 2000), which is an ideal measurement.
Neat. I have no trouble believing Stephen Wolfram could be out of touch with the current state of QM, given he's been busy with his physics project and whatnot. And he's a terrible speaker; that podcast was the longest in Mindscape history because of it.
I think his view is also partly based on the Wolfram Physics Project's multiway causal graph framework and its implications for the measurement problem. It's an interesting podcast if nothing else. Not very comprehensible, but interesting nonetheless.
Yeah. Frankly, I don't know much at all about Wolfram's work, other than that Mathematica is the best maths software in the world. As far as I can gather, his work is so strange and different that it has very little contact with most physicists. I work in quantum measurement and non-linear quantum optics, and no one discusses him on either the experimental or theoretical side. He seems to be a smart maverick with a bunch of money; I don't know if anything will come of it in the end.
Without having the first clue about the mathematical details, I can tell you it's most closely related to the work of Sean Carroll et al. trying to derive GR from QM, with spacetime being an emergent property of networks of quantum entanglement. Wolfram's idea is even more stripped down: hypergraphs of nodes with nothing but identity and some number of neighbour nodes. They're then exploring the space of possible update rules for these graphs (he calls this rulial space). He claims it should be possible to derive "all of physics" from this model, including the Schrödinger equation. He says the resulting emergent QM is most similar to Many Worlds, but interestingly, in their model it seems branches in fact merge together again (eventually, though it might take the entire lifespan of the universe). Carroll seems to think that starting with the Schrödinger equation is cleaner and more austere (of course he's rather biased), and I tend to agree.
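To give a flavour of how bare these structures are, here's a toy rewrite step in the spirit of the model (the rule {{x, y}} -> {{x, y}, {y, z}} and the step helper are illustrative choices of mine, not necessarily anything Wolfram has published; nodes are nothing but integer ids, exactly as described above):

    def step(edges, fresh):
        # Apply the toy rule {{x, y}} -> {{x, y}, {y, z}} to every edge,
        # where z is a freshly created node each time.
        out = []
        for (x, y) in edges:
            out.append((x, y))
            out.append((y, fresh))
            fresh += 1
        return out, fresh

    state, fresh = [(0, 1)], 2
    for _ in range(4):
        state, fresh = step(state, fresh)
    print(len(state), "edges:", state[:6], "...")

As I understand it, different rules and different choices of which matches to rewrite first are what generate the branching (multiway) structure; the physics is supposed to live in the large-scale limit of graphs like this.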
My gut reaction to his mention of a rulial space is to be reminded of the Calabi-Yau manifold situation in string theory. If that space is even close in size to that parameter space, it would be a theoretical nightmare.
I wanted to add that indeed he does have a lot of money, and he has hired some very talented postdocs and PhDs with some of it. I have no doubt we'll see a lot of interesting new theory out of his group at least. And that's never a bad thing, even if the theory doesn't pan out.
This is the typical Stephen Wolfram thing: he read Feynman's proposal for quantum computing, has thought about it by himself ever since, and then, I would guess, figured that the experimentalists haven't had much to contribute.