Closed loophole confirms the unreality of the quantum world (2018) (quantamagazine.org)
45 points by gyosifov on Feb 15, 2021 | 77 comments



The problem is the insistence on the existence of particles at the quantum level. We know that quanta propagate like waves, but interact with classical-level systems in discrete units. So we need to stop believing in discrete particles with distinct properties at the quantum level and come up with an explanation of why quantum waves appear particle-like at the classical level. Probably the answer will involve the way in which decoherence occurs when a large number of quanta are involved.


Replace particles with fields, where particles are the result of excitations in the fields. Fields are the fundamental stuff of reality, and particles are emergent from field interactions.


> Fields are the fundamental stuff of reality, and particles are emergent from field interactions.

I'm not sure in what sense that is the reality. At observable scales, reality seems more tied to particles, at least in our human understanding. The wave-particle duality concept seems to me just as hard to reconcile as the concepts of irrationality or infinity.

These concepts work well to power our present explanation of the world around us, yet they don't take root naturally in the human brain (not in mine, at least).


Fields are continuous. E.g. the electromagnetic field in classical electrodynamics is continuous. But its waves, photons, show discrete, quantum properties (see the photoelectric effect). It would be easy if there were some common grid, but that's not the case. You can make photons with any energy from a continuous spectrum, but photons of each frequency carry their own quantum (Latin for "amount") of energy, in discrete portions.
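To put a number on "discrete portions": a quick sketch of Planck's relation E = h*f (Python; the frequencies below are just illustrative values I picked), showing that the frequency is continuous but the energy at a fixed frequency comes in whole photons:

    # Sketch of Planck's relation E = h*f: the frequency can be anything on a
    # continuum, but at a given frequency the field's energy changes only in
    # whole photons, i.e. in steps of h*f. The frequencies below are arbitrary.
    H = 6.62607015e-34  # Planck constant, J*s

    for freq_hz in (4.3e14, 5.0e14, 7.5e14):        # roughly red, green, violet
        quantum = H * freq_hz                       # energy of one photon, J
        allowed = [n * quantum for n in range(4)]   # 0, 1, 2, 3 photons
        print(f"f = {freq_hz:.1e} Hz: one photon carries {quantum:.3e} J; "
              f"allowed field energies start at {allowed}")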

Why is this happening? That's the question worth 1000 Nobel prizes.


It's called second quantization and is taught in theoretical physics courses. https://en.wikipedia.org/wiki/Second_quantization


This is the how, not the why. That is, it's not a (more general) model from which quantization follows. Obtaining such a model would be a major breakthrough, provided that it's falsifiable under humanly attainable conditions (unlike string theory).


The answer to this is actually known and is a consequence of Heisenberg's uncertainty principle. Here's a YouTube video about it: https://www.youtube.com/watch?v=QPAxzr6ihu8

Of course, Heisenberg's uncertainty principle is a mathematical description of observed behaviour, so one could rightly argue that it doesn't really explain anything; it merely describes things.
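For what it's worth, the relation itself is easy to check numerically, which is part of why it reads as a description rather than an explanation. A rough sketch (numpy; the 1 nm Gaussian wavepacket and the grid are arbitrary choices, purely for illustration) showing the position and momentum spreads landing right at the hbar/2 bound:

    import numpy as np

    hbar = 1.0545718e-34            # reduced Planck constant, J*s
    sigma = 1e-9                    # assumed 1 nm position spread
    x = np.linspace(-20e-9, 20e-9, 4096)
    dx = x[1] - x[0]

    # Gaussian wavepacket, normalized on the grid
    psi = np.exp(-x**2 / (4 * sigma**2))
    psi = psi / np.sqrt(np.sum(np.abs(psi)**2) * dx)

    prob_x = np.abs(psi)**2
    delta_x = np.sqrt(np.sum(x**2 * prob_x) * dx)        # <x> = 0 by symmetry

    # Momentum-space distribution via FFT; p = hbar * k
    k = 2 * np.pi * np.fft.fftfreq(x.size, d=dx)
    dk = abs(k[1] - k[0])
    prob_k = np.abs(np.fft.fft(psi))**2
    prob_k = prob_k / (np.sum(prob_k) * dk)              # normalize over k
    delta_p = hbar * np.sqrt(np.sum(k**2 * prob_k) * dk) # <k> = 0 by symmetry

    print(delta_x * delta_p)   # ~5.27e-35
    print(hbar / 2)            # ~5.27e-35, the lower bound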


Heisenberg's uncertainty principle should probably be put to rest at this point. From the beginning it was more about "capturing" our inability to move beyond physics as we know it. It's like tying your shoelaces. I always found it fascinating how firmly most physicists believe in equations someone came up with, just because they agree with measurements. I mean, that's real nice and all, but just because something agrees with measurements doesn't make it a fact. There are literally infinite ways to create equations that satisfy measurements. But sure, a few decades of research in the early 20th century and that's it. This is all we can do. Let's just accept that lol.


Our current models are only good until we find a better model that replaces them, one which can explain phenomena we observe that the old model couldn't.

So, yes, it's possible there is some physics "beyond" the one we know, and yes, in order to go beyond what we know we have to consider the possibility that some of the stuff lying at the foundation of current physics is wrong (or correct only up to an approximation). Many working physicists are well aware of that fact, and they do consider all the options on the table.

The problem is, you have to find something to replace it and it has to work.

Thinking about all kinds of crazy things is great. "Temporarily" throwing away some assumptions can be a productive thinking tool. The Heisenberg principle (like many other things) can be both something you want to keep and use as a foundation for other explorations, and something you question. The field is made up of many people; not everybody should be working on the same thing under the same assumptions. I think "putting ideas to rest at this point" should be reserved for old theories that have been fully superseded, and even then they can still be useful: Newtonian mechanics remains quite useful even though we know it falls short.


That's how all of physics is constructed. Even things you may think are obvious, like Newton's 3 laws of motion, are only accepted because they agree with our measurements. How else should we determine their validity?

> There are literally infinite ways to create equations that satisfy measurements.

There really aren't. You seem to be thinking of something like in the movie The Number 23. But we're talking about equations, not numbers. Take Newton's Second Law (f = m • a). What equation can you write that expresses that relationship that can't be simplified to f = m • a?


The uncertainty principles are inequalities, not equations. And an inequality is something you can write in multiple forms.

As for equations, notice that many equations in physics have a constant which turns a proportionality into an equation. This is where you have leeway in constructing more or less arbitrary equations based on the variables you think are important enough to observe.

Coming to your example, Newton's f = ma equation is really f/m proportional to a. The units are chosen carefully to make the constant 1. This works under the assumption that mass is constant and acceleration is measured from a non-accelerating frame of reference at non-relativistic speeds. So yes, there are several other ways to write that equation.


> Newton's f = ma equation is really f/m proportional to a.

IIRC, in this context the mass m is considered a proportionality coefficient, such that the force is proportional to the acceleration.

Sure, rewriting this would fix the constant to 1, but it introduces the concept of specific force: force per unit mass.


f=d(mv)/dt?


We should distinguish between quantization, and "discrete particles with distinct properties." For example, the photoelectric effect demonstrates that light is quantized, but individual photons have no distinct identity whatsoever.

"Number of particles" is an observable like position, momentum, spin, etc. It is a quantum property which may or may not commute with other properties, and occupies the same conceptual space.


Number of quanta is observable. You can count photons.

OTOH it does not imply that a photon is a "particle" in the same sense a breadcrumb is, with a well-defined diameter, shape, or borders. It is a portion of electromagnetic field energy that was sufficient to yank an electron out of a photocell's crystal lattice.
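To put a number on that "portion of energy", here's a minimal photoelectric-effect sketch (Python; the ~2.14 eV cesium work function is an assumed illustrative value), checking whether a photon of a given frequency can free an electron at all:

    # Photoelectric effect sketch: a photon frees an electron only if its
    # energy h*f exceeds the material's work function W; the excess shows up
    # as the electron's kinetic energy (E_k = h*f - W).
    # The work function below (cesium, ~2.14 eV) is an assumed example value.
    H = 6.62607015e-34      # Planck constant, J*s
    EV = 1.602176634e-19    # joules per electronvolt
    W_CESIUM = 2.14 * EV    # assumed work function

    def ejected_electron_energy(freq_hz, work_function_j=W_CESIUM):
        """Kinetic energy (eV) of the ejected electron, or None if the
        photon doesn't carry enough energy to free one."""
        photon_energy = H * freq_hz
        if photon_energy < work_function_j:
            return None
        return (photon_energy - work_function_j) / EV

    print(ejected_electron_energy(4.0e14))   # red light: None (below threshold)
    print(ejected_electron_energy(7.5e14))   # violet light: ~0.96 eV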


Lori Gardi, widely believed to be a crackpot, has in my opinion made a breakthrough in our interpretation of quantum mechanics.

If I understand her correctly, she proposes that the reason waves behave like particles is that they transfer energy essentially at every zero-crossing: for example, a photon is just one half of a wavelength of light.

She has a much more interesting explanation on YouTube of what I am very poorly trying to spit out here. Bear with her, because honestly she is a whackadoo, but she is a smart whackadoo.

Lemme know if you want a link to the video


Why is it always YouTube videos with these (let's generously say) esoteric ideas and never a link to a paper or essay?


Not GP but since you asked, here it is: https://www.researchgate.net/publication/325462944_Planck%27...

The argument is: Planck's relation says that a photon's energy is proportional to its frequency. But a higher-frequency photon oscillates more times per second. If you look at the energy of a single oscillation, you get a constant regardless of frequency. This is remarkable, and so we should reframe Planck's constant as the fundamental "energy per cycle."

The problem is that "energy of a single cycle" cannot be related to other measures of energy, e.g. the binding energy of an electron in the photoelectric effect. Basically it seems like unit sophistry.


Sophistry is a bit strong of a term in my opinion.

Physicists have received Nobels for inventing particles to balance equations. Why isn't that sophistry? Maybe we just need to spend a few more billion dollars on a brand new particle accelerator to figure that out.


Yes, I watch her stuff as well. I appreciate her work, although it's not always consistent from video to video. I recommend people check out https://forgottenphysics.com/ . Just a guess, but I think this is probably where Lori got most of her inspiration at first.


QM is the new Ptolemaic system: it makes good predictions, but because it puts Earth in the center, the orbits become needlessly complex and artificial.


Explain how QM metaphorically “puts Earth in the center”.


That's the thing we don't know yet.

It's probably a good bet that rectifying the complexities of QM will require letting go of some assertion most people have fundamentally accepted, one that is quite uncomfortable to give up but necessary to drop in order to simplify the equations. I can't really guess what it's going to be, though.


Have you ever actually looked at the Schrödinger equation? [1] It's pretty simple. If you've had freshman physics, you can learn enough to derive it in a matter of days or at most weeks. There's not a lot of room in there for the kind of thing you are suggesting.

It's much more plausible that the uncomfortable human-centric thing we need to let go is the idea that our perceptions about macroscopic reality should be indicative of how things actually are.

[1] https://en.wikipedia.org/wiki/Schrodinger_equation
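To back up the "it's pretty simple" claim with something concrete: a minimal numerical sketch (numpy, natural units hbar = m = 1; the grid size and box length are arbitrary choices) of the time-independent Schrödinger equation for a particle in a 1D box, compared with the textbook energy levels:

    # Time-independent Schrodinger equation, -hbar^2/(2m) psi'' + V psi = E psi,
    # with V = 0 inside a 1D box (infinite well), discretized by finite
    # differences. Units and grid sizes below are arbitrary illustrative choices.
    import numpy as np

    hbar, m, L, N = 1.0, 1.0, 1.0, 500
    dx = L / (N + 1)

    # Second-derivative operator via the three-point stencil, psi = 0 at the walls
    main = np.full(N, -2.0)
    off = np.ones(N - 1)
    laplacian = (np.diag(main) + np.diag(off, 1) + np.diag(off, -1)) / dx**2
    H = -(hbar**2 / (2 * m)) * laplacian      # V = 0 inside the box

    energies = np.linalg.eigvalsh(H)[:3]
    exact = [(n * np.pi * hbar)**2 / (2 * m * L**2) for n in (1, 2, 3)]
    print(energies)   # ~ [4.93, 19.74, 44.41]
    print(exact)      #   [4.93, 19.74, 44.41]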


The system of epicycles was made of pretty simple, easy things: circles. It's when they were combined that things got complicated.

Same with the Schrödinger equation: it's very simple for one particle in an empty universe, but things get more complicated when more particles are considered.

There is no guarantee that a different approach can replace QM and describe things in a simpler way. But I suspect that attempts to look for such approaches are not useless.


What do you mean by "derive" the Schrödinger equation? As best I know, there is no "derivation" beyond some analogy with the classical wave equation. It's an equation written down by Schrödinger, motivated by the classical wave equation and insights from de Broglie's theory.

I'd love to see a derivation if it exists.


I have in mind things like this [1]. I no longer have my undergraduate text or notes but I recall going through a similar exercise.

[1] https://arxiv.org/pdf/physics/0610121.pdf


Indeed, early progress with QM in the first place required letting go of some expectations that were based on macroscopic phenomena. There was a certain effort to clarify those expectations so they could be gotten rid of. This is, at least, what I remember being taught about it.

So, maybe it's time for another round. ;-)


Circles are also simpler than ellipses. Maybe that's a clue. Maybe we need a more complicated equation that makes the phenomena more understandable.

EDIT: To stretch the metaphor a bit more: maybe QFT is the analogue of the solar system described with epicycles.


That would, indeed, be an uncomfortable thing to let go of, since they are our direct perceptions.

Relativity tells us things get weird at high velocities, but our daily perception is basically correct. Quantum theory seems to suggest that solid, tangible reality isn't at all like our day-to-day experience. That's a pretty decent leap to ask of people.


Plato and a few others told about the world of ideas and the world of matter: both exist and are connected somehow. I wonder if QM is this bridge between the two worlds and the wrong assumption we make is that QM describes the world of matter. In other words, the probability waves may actually exist, but they exist in the world of math, and the real question is how exactly the two worlds are connected.


IMO, that assumption is the idea that decoherence occurs at all.

I really think Sean Carroll's ideas involving MWI and spacetime emerging from Hilbert space provide a great deal of clarity.


That's not good enough. I want a specific claim about how one or more of the postulates of QM are likely to be incorrect (or even stranger, how they have to do with "putting Earth at the center", whatever that means, or some anthropomorphism baked into the postulate).

Saying "we don't know what we don't know" is both a tautology and completely useless. It's like my saying: "I believe that you committed a crime last week." You ask for more details about why I think that and what the crime was, and I reply "we just don't know." I hope you see how ludicrous that is.


It's been a while since I thought deeply about epicycles, but if my memory can be trusted... they weren't incorrect. They did a decent job of predicting the motions of the planets given the assumption that Earth was pegged at the center of the coordinate frame (which then required a whole bunch of "hidden behavior" in terms of non-apparent phenomena dictating the rotation of all other bodies).

Re-framing the whole system to be heliocentric allowed for the massive simplification of Kepler's Laws (and, not too much later, the further simplification of describing Kepler's laws via Newton's law of gravitational attraction).

We currently have an understanding of QM that doesn't reconcile well with macroscopic observation and requires a lot of intuitions to be broken (including, possibly, the singular nature of existence, time-forward causality, or lightspeed-constrained information locality). I can't help but wonder if there's some equivalent to "Assuming Earth is the center of the coordinate frame" that we currently do that forces these unintuitive (though working) solutions.

The solar system moved in epicycles because "it just did," until several leaps of intuition allowed us to see how it didn't. I wonder if there are similar leaps of intuition waiting on the horizon to allow us to explain quantum entanglement without nonlocal information sharing (hidden variables, we have shown, is not it).


> We currently have an understanding of QM that doesn't reconcile well with macroscopic observation and requires a lot of intuitions to be broken

Why should physics care about human "intuition" at the macroscopic level? That's an incredibly anthropic point of view to take on a cosmos that couldn't care less if we didn't exist.


When Newton developed the theory of gravity, it wasn't intuitive that all matter in the universe attracts all other matter. But it also wasn't incompatible with our intuitions; it was learnable.

What I see in QM right now "smells" like we're having to compromise too many intuitions to reconcile what the experiments are showing us with the way we expect the universe ought to work. I agree that the universe is under no obligation to work the way we expect it to. The problem is that what we have right now in QM is a lack of agreement on how to even rework our intuitions to match our observations and the reality the theory describes. That suggests to me that, much like epicycles, or like gravity being separate from acceleration before relativity, the stories we are telling ourselves about what we see are still too complicated, and a simpler explanation that requires us to sacrifice fewer pieces of intuition has not yet been reached.

This is, to be clear, extrapolation. I have no way to know whether such a simplification exists, or whether we have hit the boundary where nature simply refuses to conform to our senses and minds in any way that would let us change our intuitions to follow her. It's not impossible that this is the case. But if it is, it's a break from the pattern of physics discovery up to this point in human history.


Round Earth and heliocentrism were counterintuitive too and people struggled a lot to understand them, it's not unprecedented. Ironically Aristotle refuted heliocentrism with the same argument, that heliocentrism doesn't correspond to observation.

We have the intuition to understand QM; people struggle with myths, not with a lack of intuition. The absence of conservation of energy is one of those myths, and only the myths need to be sacrificed.


The "putting the Earth in the center" is wave function collapse. If you get rid of wave function collapse, then all the paradoxes go away, except that our experience is a tiny branch of a massive tree.


That might be the case. But something still smells in the many-worlds theory: it's hard to reconcile conservation of energy or mass with the notion of an infinitely branching multiverse, and if there is no way to interact with those other branches, are they real, or just a story we're telling ourselves to try to justify the math?

if they are just a story, it feels like a complicated story to get to the goal.

At the same time, I completely agree that the story of wave function collapse also smells. It creates a distinction between observer and observed reality, and that creates its own set of paradoxes (we can put Schrodinger's cats in nested boxes all day long, and some of the boxes even include experimenters who think they are observing the cat, but experimenters outside the lab don't agree with them that they are the observers!).


> That might be the case. But something still smells in the many worlds theory---it's hard to reconcile conservation of energy or mass with the notion of an infinitely branching multiverse, and if there is no way to interact with those other branches, are they real or just a story we're telling ourselves to try and justify the math?

Looking at the multiverse as branching is where the confusion comes in. There's one Universal wave-function. Why we don't experience the wave-function directly is the mystery.


In MWI, energy and mass are conserved; it's the state that branches.


It seems that way to me too, but it (i.e. the notion that QM is needlessly complex and akin to the Ptolemaic system) is nothing but conjecture without some candidate assumptions that might need to be challenged.

[edited for clarity]


It's quite the opposite. QM was discovered while we were trying to resolve certain problems such as the ultraviolet catastrophe. [0]

To say it is "nothing but conjecture" is incredibly condescendingly dismissive and ignorant of the history of its evolution.

[0] https://en.wikipedia.org/wiki/Ultraviolet_catastrophe


I think the commenter meant that the notion that QM is needlessly complex and akin to the Ptolemaic system is nothing but conjecture.


Yes, thanks, I edited out the ambiguity.


Has anyone ever considered that perhaps the particle/wave actually goes back in time?


Yes, that's the "transactional interpretation" of quantum mechanics.

https://en.wikipedia.org/wiki/Transactional_interpretation


There was an Israeli team that "proved" quantum particles can go back in time by linking groups of entangled particles that didn't exist at the same time.

Would have to look for the link, and don't know if that was ever rejected, but.. maybe?

By extension, spooky action may be ignoring time entirely.


Quantum mechanics definitively contains nonlocal effects (entanglement), which does seem to imply some kind of time traveling information, at least when combined with special relativity.


Kind of. Quantum states don't have a place or time. Only objects that share those states have locations.

The concepts like locality and physical object have become more specific over time.

Before QM, the principle of locality in physics matched the common sense of locality in every way: an object is directly influenced only by its immediate surroundings, objects have locations, and a field must mediate any action at a distance.

After QM was developed, the principle of locality in physics became more refined. Quantum states don't have a location in space and time. They are not 'physical objects' themselves. A quantum state is everywhere in time and space where the objects sharing that state are, and everything correlates perfectly over time and space.

Before QM, something like a quantum state would have been called a non-local thing. In the current, more refined meaning it's not breaking locality. A non-local, instant quantum state 'hovering over vast distances of space and time' is OK as long as causality is local and moves at the speed of light.


I thought nonlocality was the obvious answer to "no hidden variables", but so many prefer hidden multiverses instead. ;-)


Carlo Rovelli


Quantum mechanics is not the simplest explanation around.

Planck's energy equation should be

E=htf

h should be in the units of Joule as opposed to Joule second.

This formula reveals the true nature of the photon: it's not a fundamental particle. What we regard as a photon today is the energy of one second's worth of light at some frequency f.

Every wavelength of light contains the same energy, h, regardless of frequency.

When you realize Planck's constant should be in joules, you will shortly realize that the fine structure constant actually has units of seconds.

Another neat thing is that light has mass: h/c^2 (ridiculously small). If it has mass then it should have net charge as well. This could explain why light curves without needing general relativity.

Coulomb's law, using the permeability of free space instead of the permittivity, reveals that Coulomb's law is a long form of expressing E=mc^2

F * d = muc^2 q1q2/d

E = (h2alpha/e^2c)(q1*q2/d)c^2

Edit: formatting


I lack the requisite amount of sleep to evaluate the truth of this post, but I did really enjoy the fact that this appears to be an unusual explanation of basic concepts that doesn't immediately devolve into mysticism.


Thank you for your consideration; I usually get downvoted for saying this on here.


Do you have any references for this conjecture? It's an interesting idea, but I'd like to see more work exploring this line of thought.


I recommend reading the papers here https://forgottenphysics.com/

They're all short reads. There's one that goes into the history of E=hf and how Planck actually originally had E=htf but Boltzmann criticized him for it.


Here are the easy-viewing Wikipedia pages on Wheeler's delayed-choice experiment[0],

and my favorite the Delayed-choice quantum eraser[1].

[0] https://en.wikipedia.org/wiki/Wheeler%27s_delayed-choice_exp...

[1] https://en.wikipedia.org/wiki/Delayed-choice_quantum_eraser



QM is obviously very true in the sense that it makes useful predictions (in that sense, truth is somewhat a quantity). But it is arguably very inelegant in its seemingly ad-hoc introduction of probability. Naturally, one would hope to one day get a more principled explanation of this randomness.

What would be needed to make physicists search for such an explanation? Some kind of pattern in the "randomness"? Say, photons behave more like particles on Monday mornings?


What you want is a theory of local hidden variables. [0]

They are at odds with what quantum mechanics predicts (and what we have observed).

On a semi-related note, I might be reading too much into your comment, but I very much dislike when people imply that scientists haven't thought outside the box or tried non-mainstream theories. They try all the time, but fail because those theories often aren't true. What we're left with are the best extant theories, even if they're obviously incomplete.

[0] https://en.wikipedia.org/wiki/Local_hidden-variable_theory?w...


It is true that at my level of understanding I would intuitively prefer hidden variables. But that's not my point.

My point is that the notions of "measurement" or "wave function collapse" simply don't seem to explain anything. Instead they are just different words for what we observe. So yes, that is certainly useful, but it also seems to be limiting.

A simple question would be: What is "measurement", i.e. what is the fundamental thing that forces a probability distribution to yield a concrete value? And why does it exist separately from said probability distribution?

Edit: to make things even clearer: I am not lamenting that no one is working on mathematically consistent interpretations of QM. I know people are doing that, and I know that it is difficult. Instead I am asking what would be a clearly visible limit of QM. Where would we, as a society, encounter a situation where we say: "We really need to explain the reasons behind QM or we won't get this problem solved."


No.

Correct theories are suppressed all the time by oligarchs. See the electric car.

This assumption that the best is chosen is part of the problem of the scientific institution - the assumption that corruption doesn't play a major role in what is allowed through the gatekeepers.


If you go with the Many-Worlds Interpretation, then trying to remove the randomness is nonsensical: the world just branches (in proportions as described by the Schrodinger equation), people on every branch have experience, and "randomness" is just what branch you find yourself on. The randomness of what branch you find yourself in is just like the randomness of what person you find yourself born as.

All classical theories and interpretations of QM already have indexical uncertainty (the randomness of what person you find yourself born as). MWI avoids adding any new kinds of entities not implied by the Schrodinger equation and effectively explains away quantum randomness by implying that it's the same thing as indexical uncertainty, instead of being a separate kind of randomness.


On the contrary, it is still an open problem to rigorously derive the Born rule probabilities in MWI just from this perspective. Either way, the response will be equivalent to the Born rule: instead of positing that a measurement outcome's probability is proportional to the amplitude of the state, we posit that the number of branches in which an outcome happens is proportional to the amplitude of that outcome. I'm not sure why these are fundamentally different.

Also, the MWI idea of branching is no more satisfying or intuitive than the wave function collapse, which at least doesn't require an infinity of universes out of which some are much more probable than others.

Note also that there is only one of you in MWI; you just exist with different amplitudes in different states. But when interacting with another object, you become entangled with a single outcome and thus can no longer perceive the other states that other versions of you perceive. This is important, as otherwise physical quantities would not be properly conserved.


The Born rule is a statistical distribution, and statistics are certainly computable in MWI. It's a matter of converting the wave function into a distribution basis, where it will have the Born rule distribution with amplitude close to 1, which means that observations follow the Born rule distribution in most cases.


This is one thing I love about MWI... it has at least some kind of explanation for randomness. For every other interpretation they do some stuff and then choose a random outcome, and I want to ask "and how does that random step happen?"


But it isn't that simple. There isn't a "proportional split" that can depend on the Schrödinger equation. This is exactly why QM behaves non-intuitively in the first place: you can't add up probabilities from individual events, you have to add complex numbers, and only at the "end" do you square them and get probabilities. So it would be a nice analogue in MWI that the world just splits up and that's why probabilities arise, but it doesn't fit... :/
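For anyone who hasn't seen the arithmetic, the difference is easy to show in a few lines (Python; the two-path toy setup and phases are my own illustrative choices, not anything from the article):

    # Adding complex amplitudes and squaring at the end is not the same as
    # adding probabilities. Two equal-amplitude paths (think two slits) with a
    # relative phase; printed values are relative detection rates, not
    # normalized probabilities. All numbers are arbitrary.
    import numpy as np

    amp = 1 / np.sqrt(2)                          # each path alone: probability 0.5

    for phase in (0.0, np.pi / 2, np.pi):
        a1, a2 = amp, amp * np.exp(1j * phase)
        quantum = abs(a1 + a2) ** 2               # add amplitudes, then square
        classical = abs(a1) ** 2 + abs(a2) ** 2   # add probabilities directly
        print(f"phase={phase:.2f}  quantum={quantum:.2f}  classical={classical:.2f}")

    # phase=0.00  quantum=2.00  classical=1.00   (constructive interference)
    # phase=1.57  quantum=1.00  classical=1.00
    # phase=3.14  quantum=0.00  classical=1.00   (destructive: no detections)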


This is a misunderstanding of QM. Probability isn't what's unusual; it's the Born rule and the fact that the world doesn't use probability laws over real numbers but over complex numbers. That's what gives waves some sense of reality.


The probability waves may be regular real waves; it's just convenient to describe the amplitude and phase at every point with a complex number. We could describe a water surface the same way, but it wouldn't mean water is complex.


The point is that waves are inherent in the description, whether it's complex numbers or amplitude + phase.


> What would be needed to make physicists search for such an explanation?

I'd say confirmation of a failed prediction by QM is what's needed.


Local hidden variables as an explanation have been experimentally refuted: https://en.m.wikipedia.org/wiki/Bell%27s_theorem
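To make the refutation concrete, here's a rough CHSH sketch (Python; the toy hidden-variable model and the measurement angles are my own illustrative choices): quantum mechanics predicts |S| = 2*sqrt(2), while the local model stays within the classical bound of 2.

    # CHSH version of Bell's theorem. Quantum mechanics predicts singlet
    # correlations E(a,b) = -cos(a-b), giving |S| ~ 2.83; any local
    # hidden-variable model (one toy example below) obeys |S| <= 2.
    import numpy as np

    rng = np.random.default_rng(0)
    a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4   # standard CHSH angles

    def chsh(E):
        return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

    # Quantum prediction for the singlet state
    E_qm = lambda x, y: -np.cos(x - y)

    # Toy local hidden-variable model: each pair carries a hidden angle lam;
    # each side's outcome depends only on its own setting and lam.
    lam = rng.uniform(0, 2 * np.pi, 200_000)
    def E_lhv(x, y):
        A = np.sign(np.cos(x - lam))
        B = -np.sign(np.cos(y - lam))
        return np.mean(A * B)

    print("quantum |S| =", abs(chsh(E_qm)))    # ~2.83, violates the bound
    print("LHV     |S| =", abs(chsh(E_lhv)))   # ~2.00, at or below the bound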


The probabilities spawned QM, not the reverse. The modern interpretation of QM is a reification of those probabilities - an insistence that they aren't the results of a process, but are just probabilities, irreducibly.


And it's not as if people found these probabilities, decided to stop trying to explain things, and said "what if this is just how physics works". Instead, people found that attempts at removing the probabilities (through local hidden-variable theories) ended in contradictions, and then accepted that they must be fundamental.


> explanation of this randomness

what if randomness is an inherent property of nature?


The problem is that QM doesn't allow randomness at the fundamental level. The Schrodinger equation is a linear differential equation; systems that evolve according to it do so just as deterministically as in classical mechanics.

However, when you measure the state of such a system after however many steps of perfectly deterministic interactions you want, you find the system takes only one of the many possible states predicted by the Schrodinger equation, with a probability that depends on the amplitude of that state.

It is this discrepancy, deterministic evolution within both the quantum world and the classical world but a probabilistic crossing between them, that people find disconcerting.

MWI even does away with this to some extent, explaining it as a kind of observation bias: as a particle interacts with an independent system, it loses its ability to interfere with itself (decoherence), and so we get many versions of the system, each interacting with a single version of the particle, which simulates classical physics for each version. From the perspective of any particular version of this system, it is random which particular version of the particle it interacts with, even though at the universal level there is no randomness.
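A minimal sketch of that split (Python; the two-level Hamiltonian and evolution time are arbitrary choices, purely for illustration): the wave function evolves deterministically, and only the final measurement step introduces probability, via the Born rule.

    # Deterministic evolution vs. probabilistic measurement for a two-level
    # system. The Hamiltonian and time below are arbitrary illustrative choices.
    import numpy as np

    rng = np.random.default_rng(1)
    hbar = 1.0
    H = np.array([[0.0, 1.0],
                  [1.0, 0.0]])                 # simple two-level Hamiltonian

    def evolve(state, t):
        # Deterministic: |psi(t)> = exp(-i H t / hbar) |psi(0)>
        w, v = np.linalg.eigh(H)
        U = v @ np.diag(np.exp(-1j * w * t / hbar)) @ v.conj().T
        return U @ state

    def measure(state, shots=10_000):
        # Probabilistic: outcomes 0/1 sampled with Born-rule probabilities
        p = np.abs(state) ** 2
        return np.bincount(rng.choice(2, size=shots, p=p), minlength=2) / shots

    psi0 = np.array([1.0 + 0j, 0.0])           # start in state |0>
    psi_t = evolve(psi0, t=np.pi / 3)
    print(np.abs(psi_t) ** 2)                  # exact probabilities [0.25, 0.75]
    print(measure(psi_t))                      # sampled frequencies ~[0.25, 0.75]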



