Hacker News

This seems like such a bad idea, the equivalent of catapulting ever greater rocks on the moon to test Newton's laws.



Funny thing is that we have been running other experiments you would find just as silly: test gravitational attraction between human-scale objects with ridiculously expensive and sensitive equipment, test whether there is a difference between gravitational and inertial mass, interfere neutrons(?) after they pass through slightly different gravitational fields on Earth. My favorite is measuring the dipole moment of the electron (like, duh, of course the electron does not have a dipole moment).

All of these tests are silly if you believe the theories of the day. The vast majority of such tests in history have simply confirmed the theory of the day. But a small handful of them spurred the most amazing intellectual explosion in the history of this planet (special/general relativity and quantum theory).

Also, the investment permits the contractors for the experiment to create an industrial base crucial for applications in the general economy.


The standard model of particle physics does predict the electron should have an EDM.

https://en.wikipedia.org/wiki/Electron_electric_dipole_momen...

However, a nonzero electron EDM requires P and CP violation; in the standard model it’s generated only through loop effects from the CP violation in the CKM matrix, making it extremely small (~1e-38 e•cm).

The experimental results are consistent with zero to a precision of ~1e-28 e•cm. So there are TEN orders of magnitude between experiment and theory. That makes it not a silly thing to measure but a particularly appealing experimental target: lots of models of new physics predict something substantially larger than 1e-38, and the tiny standard-model value makes any nonzero measurement an unambiguous signal.

Contrast the muon g-2 measurements, for example, where a new lattice QCD calculation of standard-model effects claims to adjust the prediction in the 10th decimal, reducing the tension to 1.5 sigma. If an electron EDM is found anywhere in the next… 8 or 9 orders of magnitude, it’d be an inarguable sign of new physics.
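To make that window concrete, a back-of-the-envelope sketch using only the rough numbers quoted above (the values themselves are the thread's, not precise figures):

```python
import math

# Back-of-the-envelope sketch using the rough numbers quoted above.
sm_prediction = 1e-38  # e·cm, loop-level standard-model estimate
exp_bound = 1e-28      # e·cm, rough current experimental precision

gap = math.log10(exp_bound / sm_prediction)
print(round(gap))  # 10 orders of magnitude of room for new physics
```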


That is exactly the point I am making with my (facetious) comment. We measure it because a mismatch from the very much expected result would be monumentally important, even if improbable.


This is like spending $1B to pile up dirt to test the size of the biggest pile of dirt that can be piled with $1B. How can anyone know for sure that the laws of physics don't change when the expense of the pile crosses the $800M threshold?

Obviously, the laws of physics do not have a dirt pile expense parameter. Just because some non-absurd ideas have been called absurd does not mean that no ideas are absurd. The idea that quantum mechanics stops working when too many particles get together is likewise absurd, because just like there's no way for a particle of dirt to "tell" how much money was spent piling it up, an atom does not "know" how many other atoms are in the macroscopic crystal it lives in.


The laws of physics have a lot of "dirtpile scale" parameters. When spacetime curvature (the presence of mass) becomes measurable, Newtonian mechanics stops working. When the action integral is comparable to the Planck constant, classical probabilities do not work. When the Reynolds number goes beyond a threshold, laminar models of fluids do not work.
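The Reynolds-number case is the easiest of these thresholds to sketch; a minimal example with illustrative numbers (the ~2300 pipe-flow transition value is the textbook one):

```python
# Sketch of the "scale parameter" point: laminar pipe-flow models stop
# working when the Reynolds number crosses a critical value (~2300 for
# pipe flow). All numbers below are illustrative SI values for water.
def reynolds(rho, v, L, mu):
    """Dimensionless Reynolds number: rho * v * L / mu."""
    return rho * v * L / mu

RE_CRITICAL = 2300.0  # textbook transition value for pipe flow

re_slow = reynolds(rho=1000.0, v=0.01, L=0.01, mu=1e-3)  # = 100
re_fast = reynolds(rho=1000.0, v=1.0, L=0.01, mu=1e-3)   # = 10000
print(re_slow < RE_CRITICAL, re_fast < RE_CRITICAL)  # True False: laminar vs turbulent
```

The same structure holds for the other examples: a dimensionless ratio (curvature, action over Planck's constant) tells you when the simpler model stops applying.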

You are factually wrong about your quantum mechanics claims. (1) The idea that quantum mechanics stops working at some mesoscopic scale is extremely popular among (respected) sceptics of quantum computing. (2) The most well known example of a quantum thought experiment, the Schrödinger cat, is about quantum mechanics (seemingly) not working at macroscopic scale (only last year was there a good theoretical explanation of the paradox: https://pirsa.org/20010099 ). (3) Decoherence is at the center of disputes between proponents of different formalizations of quantum mechanics, and this would be a very stringent test of that process. There are plenty more ideas that would get tested by this experiment.

Also, I think you misunderstood my examples. None of the examples I gave in my last post were considered absurd; rather, all of them were fine measurements that were necessary in order to believe the theory when going into a new parameter regime. With your attitude we would never have found general relativity (of course throwing a bigger rock on the moon will follow Newton's laws), quasicrystals (of course only regular repetition can make a crystal), or prions (of course proteins do not reproduce).

If you are arguing that this money right now would be better spent on global warming mitigation, then I would probably agree. But that is a relative, not an absolute statement like yours.


>The most well known example of a quantum thought experiment, the Schrödinger cat, is about quantum mechanics (seemingly) not working at macroscopic scale (only last year was there a good theoretical explanation of the paradox: https://pirsa.org/20010099 )

It's like I'm reading a comment from a strange alternate universe where decoherence was never discovered and quantum mechanics hadn't progressed past the 1930s philosophically. How can you know about so many proposed experiments without also knowing why "quantum mechanics stops working at macroscopic scales" is absurd? Is there really a group of scientists that think that? Can we find some missionaries to translate their native language and reach them?


I linked a lecture from last year by Scott Aaronson, one of the most respected quantum information theorists in the world. Are you really claiming that he is unaware of the philosophical progress since the 30s? I also happen to work in this scientific field professionally, although I am not a luminary like Aaronson. I do consider these questions not yet well answered, so that is a "yes" to your rather rude question about whether scientists are interested in this.

Also, I am truly confused: what do you mean by "decoherence" being discovered? Being able to perform a partial trace and use a density matrix (and call it "decoherence") does not really help with answering these questions.
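To spell out what that partial trace computes, a minimal numpy sketch (the two-qubit state and the basis labels are illustrative): tracing out an entangled environment leaves a diagonal density matrix, i.e. the off-diagonal coherences are gone.

```python
import numpy as np

# Minimal sketch of "decoherence as a partial trace".
# System qubit entangled with one environment qubit:
# |psi> = (|0>_S |0>_E + |1>_S |1>_E) / sqrt(2)
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)         # basis order: |00>, |01>, |10>, |11>

rho = np.outer(psi, psi.conj())          # full density matrix, a pure state
rho = rho.reshape(2, 2, 2, 2)            # indices: (S, E, S', E')
rho_S = np.trace(rho, axis1=1, axis2=3)  # trace out the environment

print(rho_S)  # diag(0.5, 0.5): off-diagonal coherences are gone
```

The disagreement in the thread is precisely whether this formal operation constitutes an "explanation" or merely a calculation.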


The explanation for Schrodinger's cat is that in any practical box, you will end up as a part of the cat's state almost instantly. There will be one state where you're standing outside of a box with a living cat, and another where you're standing outside of a box with a dead cat. It's up to you to assign metaphysical status to those states, but the dynamics are clearly predicted.

I think you may be misapprehending the meaning of Aaronson's talk. He is saying that proving that a specific cat was in a superposition would be just as hard as changing its state from dead to alive, but that does not preclude the possibility of proving that several cats are in a superposition by observing the statistical properties of an ensemble of measurements. If that talk explained why Schrödinger's cat doesn't happen in our daily lives, it would have to address why we can't detect cat superposition probabilistically, the way we detect particle superposition in most experiments (the interference pattern in the double-slit experiment only appears after many measurements).


You are using vague, ill-defined words that just end up confounding topics in a meaningless way. For one, the premise of Schrödinger's cat has absolutely nothing to do with superpositions of multiple cats. There is no such thing as a superposition of multiple objects, especially if they are distinguishable like cats. Superpositions are over different possible states of a single system, not over multiple copies of a system. You cannot interfere two cats together. The most charitable interpretation of what you are saying is some abuse of the language used to describe indistinguishable particles, or some statement in second quantization, but that also has nothing to do with Schrödinger's cat.


>There is no such thing as superpositions of multiple objects, especially if they are distinguishable like cats.

Several superpositions of several cats make an ensemble of experiments over which it would not be as difficult to detect entanglement as it would be in the case of one cat, which is the case that Aaronson describes.


Just write the ket describing that state and specify over what Hilbert space it is defined. At best, "superposition" does not mean what you are using that word for. It just sounds like nonsense right now.


1/sqrt(2) |alive> + 1/sqrt(2) |dead>

Take several of these and you can collect data across several experiments that shows they're in a superposition, but if you only have one, Aaronson's argument applies.
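A minimal numpy sketch of that statistical point (cats standing in for qubits; all states and numbers are illustrative): single-copy statistics in the alive/dead basis cannot distinguish the superposition written above from a 50/50 classical mixture, but ensemble statistics in a rotated basis can.

```python
import numpy as np

# Minimal sketch of the ensemble argument (cats standing in for qubits;
# all states and numbers are illustrative).
alive = np.array([1.0, 0.0])
dead = np.array([0.0, 1.0])
plus = (alive + dead) / np.sqrt(2)   # the ket written above

superposition = np.outer(plus, plus)  # pure state |psi><psi|
mixture = 0.5 * np.outer(alive, alive) + 0.5 * np.outer(dead, dead)

# Both give 50/50 statistics in the alive/dead basis...
p_alive = lambda rho: float(alive @ rho @ alive)
# ...but statistics in the rotated basis tell them apart:
p_plus = lambda rho: float(plus @ rho @ plus)

print(round(p_alive(superposition), 6), round(p_alive(mixture), 6))  # 0.5 0.5
print(round(p_plus(superposition), 6), round(p_plus(mixture), 6))    # 1.0 0.5
```

The catch Aaronson's argument points at is that the rotated-basis measurement on a cat is exactly the operation that is as hard as bringing it back to life.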


More experiments are always worthwhile, but from a practical perspective funding is a zero-sum game with a small number of finite pots. The question being raised isn't whether this proposal is worthwhile "in general", but whether it's a good use of finite funding vs other potential science.

Given the immense cost and the lack of serious reasons to expect otherwise, I think it's a pretty reasonable question to ask.


I've heard it expressed numerous times that the zero-sum-game argument is a fallacy. There are a lot of foundations and donors that pick among projects and might just hold on to their money if a suitable project to fund is not available.


Spending $1B to build an unmotivated device is not a suitable project. There is no reason to build this thing, not a single one has been suggested.


I cannot comment on the amount of money, but you do get that the question to be tested by this proposal is the main unsolved question that bothered Schrödinger, Bohr, Einstein, and others, right?


The $1B price tag came from the linked article. The claim about the main unsolved question is simply false - the process by which classical mechanics emerges from quantum mechanics is already well understood and partly taught in undergraduate courses.


Yes, I have even been an instructor for some of these classes. No, knowing how to take a limit / apply steepest descent does not tell you whether your model of reality is correct; it just lets you convert your model (quantum) to a more restricted model (classical). But you do not know whether your initial model is not itself the limit of another model.


Converting the quantum model into a classical model in the large-scale limit is an explanation of why we don't see quantum mechanics in our daily lives. It shows that no extra dynamics are necessary to "kill" quantum effects on macroscopic scales, because within the laws of quantum mechanics this is already explained.


Decoherence theory explains that if you have an entangled system that is not sufficiently isolated, it will quickly pull the environment (including the researchers) into the superposition. This explains why a human won't see a "blur" or whatever when looking at Schrödinger's cat. But this seems apparent from the start unless you subscribe to the Copenhagen interpretation literally (something I'm pretty sure not even most of the earliest QM researchers did).

Since decoherence is just a linear (normal) process within linear QM, you still need to postulate the Born rule and subscribe to one of the interpretations thereof, even if you're an Everettian, to make predictions. I think this is the main open question in QM.

But sure, there are many fringe theories going around, and doing tests to falsify them would be good; it just seems steep to do it at $1B, and the experiment only pushes the limit quantitatively, not qualitatively (as the same experiments are done on Earth regularly with lighter objects).


The proposed experiment just pushes the limit forward a bit - if it works (as is generally assumed, I think), the sceptics will just say that the breakdown limit is higher than the mass of the glass beads. :)


I wonder how precise the measurement of those trajectories would need to be, to detect deviations due to special/general relativity.



