> Just as a pixel is the smallest unit of an image on your screen and a photon is the smallest unit of light, he argues, so there might be an unbreakable smallest unit of distance: a quantum of space.
I thought that role was fulfilled by the Planck length. I've seen it used in conjunction with the speed of light to calculate a theoretical "refresh rate" of the universe; was that based on a flawed assumption?
Planck units are what you get if you set the gravitational constant, the Coulomb constant, the speed of light, Boltzmann's constant, and the reduced Planck constant all to 1, resulting in the elimination of constants in equations like Newton's law of gravitation. The argument was that using such units eliminated anthropic bias.
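For concreteness, the Planck scales fall straight out of dimensional analysis on those constants. A quick sketch in Python, using CODATA values (just the gravitational trio; charge and temperature work the same way):

```python
import math

# CODATA recommended values (SI)
G    = 6.67430e-11      # m^3 kg^-1 s^-2, gravitational constant
c    = 2.99792458e8     # m/s, speed of light
hbar = 1.054571817e-34  # J s, reduced Planck constant

l_P = math.sqrt(hbar * G / c**3)  # Planck length
t_P = l_P / c                     # Planck time
m_P = math.sqrt(hbar * c / G)     # Planck mass

print(f"Planck length: {l_P:.3e} m")   # ~1.6e-35 m
print(f"Planck time:   {t_P:.3e} s")   # ~5.4e-44 s
print(f"Planck mass:   {m_P:.3e} kg")  # ~2.2e-8 kg
```

Note how arbitrary the construction is: swap G for 4πG (or Coulomb's constant for the vacuum permittivity) and every one of these numbers shifts by an O(1) factor, which is exactly the objection below.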
Well, the immediate problem with such arguments is that it's not clear those are the right values to set to 1. The original discussion started before Einstein developed general relativity, in which 4πG crops up more often than G itself. Likewise, it arguably makes more sense to set the permittivity of free space, rather than Coulomb's constant, to 1, which eliminates the constants in Maxwell's equations. On the other hand, using a unit charge of ⅓ the electron charge would mean every particle (so far as we know) has integer charge. The Planck charge (under any of the conventions mentioned above) is not a comfortable multiple of that charge.
Effectively, arguing that Planck units hold any physical significance amounts to modern-day numerology for the most part, just as much as when some physicists tried to argue numerological principles for the fine structure constant being exactly 1/137.
Was just wondering the same. I suspect the Planck length is a lower bound on the hypothetical smallest unit of distance, but in practice perhaps it's larger. The difference between this experimental resolution (10^-18 m) and the Planck length (10^-35 m) is absolutely enormous, though, so a negative result is both highly likely and very inconclusive.
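To put a number on that gap (using the two length scales from the comment above):

```python
import math

experimental_resolution = 1e-18  # m, roughly the best length scale probed so far
planck_length = 1.616e-35        # m, CODATA value

# How many orders of magnitude separate experiment from the Planck scale?
gap = math.log10(experimental_resolution / planck_length)
print(f"gap: {gap:.1f} orders of magnitude")  # ~17
```

Seventeen orders of magnitude is about the same ratio as between the width of a human hair and the distance to the nearest star, so "inconclusive" is putting it mildly.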
If there's a smallest unit of distance, shouldn't there be a shortest wavelength? And thus a smallest difference in wavelength and maximum energy? And a smallest energy change? And thus there must be a minimally different relative velocity, to preserve discretization of relativistic energies and momenta? And a thousand other things we also have no evidence for?
> If there's a smallest unit of distance, shouldn't there be a shortest wavelength?
If I remember correctly, a photon with a wavelength of the Planck length would become a black hole.
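A rough back-of-envelope check of that claim, mixing E = hc/λ with the classical Schwarzschild radius (heuristic only; this is exactly the regime where we'd need quantum gravity for a real answer):

```python
import math

c    = 2.99792458e8     # m/s
hbar = 1.054571817e-34  # J s
h    = 2 * math.pi * hbar
G    = 6.67430e-11      # m^3 kg^-1 s^-2

l_P = math.sqrt(hbar * G / c**3)  # Planck length

# A photon whose wavelength equals the Planck length:
E   = h * c / l_P        # photon energy, J
m   = E / c**2           # mass-equivalent of that energy
r_s = 2 * G * m / c**2   # Schwarzschild radius of that mass-energy

print(r_s / l_P)  # 4*pi, ~12.6: the "horizon" exceeds the wavelength
```

So at least naively the photon's energy is confined inside its own would-be horizon, which is the hand-wavy basis for the black-hole claim.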
> And a smallest energy change?
No, a small energy means a large wavelength. And not all energy must be representable by a wavelength anyway: imagine an energy difference so slight that the equivalent photon has a wavelength greater than the size of the universe.
> And thus there must be a minimally different relative velocity
No, as per above.
> momenta
Angular momentum is quantized, so study that and it might help.
> And a thousand other things we also have no evidence for?
We can still theorize.
PS. To whoever downmodded him: only downmod what's off-topic or stupid. Questions, even those with an anti-authoritative tone, should not be downmodded but answered.
Is there any reason there can't be a photon with a wavelength greater than the width of the (visible) universe?
It seems like a photon with a 96 billion lightyear wavelength has 2 * 10^-52 Joules of energy. (1 * 10^-33 eV); is there any reason a photon can't have that little energy?
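Sanity-checking those numbers with E = hc/λ:

```python
# Energy of a photon with a 96-billion-light-year wavelength.
h  = 6.62607015e-34   # J s, Planck constant
c  = 2.99792458e8     # m/s, speed of light
ly = 9.4607e15        # metres per light-year

wavelength = 96e9 * ly            # 96 billion light-years, in metres
E_joules = h * c / wavelength
E_eV = E_joules / 1.602176634e-19

print(f"{E_joules:.1e} J  ({E_eV:.1e} eV)")  # ~2e-52 J, ~1e-33 eV
```

The figures in the comment check out.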
Among other things it would take 96 billion years to create the photon.
So that means whatever process created it needs to take that long. I'm not sure what would happen if the process is interrupted before it's complete. But I suspect the photon would "go back in time", and never have been emitted in the first place.
This "time travel" doesn't pose issues because the photon takes that long to be detected, so if it's interrupted it simply wouldn't be detected.
Wouldn't it only take 13 billion years if you'd started at the big bang with a photon that had the wavelength of the then-observable universe (since space stretched under it)?
I also have to ask how that jibes with it being a quantized change: either the photon is emitted or it isn't (since a photon with that energy either exists or it doesn't), or there's some probability distribution that we'll detect the photon (which might change over time). But how can it be half emitted 46 billion years into the process?
In the end, it either provides a kick at its energy level to another property (e.g., electron momentum) in one quantum jump, or it doesn't.
Hmm. In an expanding universe, what's happening to the Planck length? Is it staying constant? Expanding at the rate of expansion of the universe? Doing something else?
> And a smallest energy change?
> And a thousand other things we also have no evidence for?
It's called quantum physics because it was shown that energy is exchanged in small, indivisible packets. So yes, that exists, and has been demonstrated. It is the whole point of quantum physics, "quantum" referring to an amount, with the smallest packet of light being a "quantum of light":
https://en.wikipedia.org/wiki/Quantum
You misunderstand your link. A quantum is the smallest unit of energy in that particular interaction. In other interactions the quantum could have a different magnitude.
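To illustrate that point: a single photon is one quantum in any interaction, but the size of that quantum, E = hf, depends on the frequency involved. A quick sketch (the frequencies are just illustrative picks):

```python
# One photon is one quantum in every case below, yet the energies
# span many orders of magnitude; there is no universal energy quantum.
h = 6.62607015e-34  # J s, Planck constant

for name, f in [("FM radio", 1e8), ("green light", 5.6e14), ("gamma ray", 1e20)]:
    print(f"{name}: one photon = {h * f:.2e} J")
```

So "quantized" means discrete per interaction, not built from a single universal unit of energy.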
The significance of the Planck length is so far based heavily on conjecture. Some theories of quantum gravity look at it as a minimum distance two events have to be apart for them to be distinguished, others incorporate it in other ways, and some don't imply anything about it at all. As others have noted, we're nowhere near being able to use physical evidence to decide it.
The Planck length has no physical meaning. It's just a combination of physical constants that gives a length. It's considered a scale at which quantum gravitational effects become significant, but it's just that, a scale. It might be a factor of 100 off from something significant, but what that thing is we have no idea.
I have read alternate interpretations of the Planck length/time/volume/etc.
One says that they are a mathematical construct with no meaning; the other says that they provide a bound on the limits of measuring reality as we understand it (which many people take to mean the smallest unit).
Do you think these definitions disagree? Does one definitely supersede the other? If so, could you point me at some literature?
Not a physicist, but I found this link [0] to be understandable. The Heisenberg uncertainty principle says that the more accurately we know position, the less accurately we know momentum. When you formalize this, the result is that as the size of the particle you are trying to measure decreases, the energy necessary to measure it increases. Since energy causes gravity, there is a length, l, where measuring a particle would require so much energy that the resulting gravity creates a black hole.
I think it is fair to say that l is just a mathematical construct, since nothing really happens at this point. But it is a useful mathematical construct that is, theoretically, relevant to our ability to observe the universe.
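Here's a minimal sketch of that crossover, equating the energy of a probe photon resolving a length l (E ~ ħc/l) with the energy whose Schwarzschild radius equals l. This is a heuristic scaling argument, not a rigorous derivation:

```python
import math

G    = 6.67430e-11      # m^3 kg^-1 s^-2
c    = 2.99792458e8     # m/s
hbar = 1.054571817e-34  # J s

# Probing a length l needs a photon of energy E ~ hbar*c/l (uncertainty
# principle). That energy has a Schwarzschild radius r_s = 2*G*E/c**4.
# Setting r_s = l and solving gives the crossover length:
#   l = sqrt(2*hbar*G/c**3)
l_cross = math.sqrt(2 * hbar * G / c**3)
l_P     = math.sqrt(hbar * G / c**3)  # Planck length for comparison

print(l_cross / l_P)  # sqrt(2): same scale, up to an O(1) factor
```

Note that the O(1) factor depends on exactly which heuristic you plug in, which is why "the Planck length is a scale, not a hard boundary" is the fair reading.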
Having said all of that, this is at a much smaller scale than we have ever been able to observe. Seeing as we needed to invent new physics to explain scales as large as Mercury's orbit (relativity) [1] and as small as atoms (quantum mechanics), and that these two theories are still incompatible, it is highly likely that describing physics near the Planck scale would require another reinvention of physics.
Yes, I too can contrive a thought experiment that combines a bunch of physical constants. It's hardly a construct. It's just a length. It might mean something. It might mean something when multiplied by pi, or a tenth, or whatever. It's the scale at which quantum gravity might maybe possibly be important. Nothing more, nothing less.
I would not consider the explanation I describe a thought experiment, but rather a mathematical derivation; although I might be showing my roots as a mathematician instead of a physicist there.
Additionally, all the argument, as I presented it, shows is that there is some length beyond which we cannot observe. It turns out that we can calculate this length, and the result happens to be precisely the Planck length. As a mathematician, this seems highly unlikely to be a coincidence; however, I am not familiar enough with physics to know whether this is a deep result or a trivial consequence of its definition.
To clarify the disagreement: do you agree that our current theory predicts that there is some length beyond which we cannot measure?
Yeah, that would make sense, but I don't imagine it's a hard cutoff. It may be that you get diminishing information per unit energy at a length scale related to the Planck length or something.
Well, if current physics is worth anything, a photon with a wavelength around that scale is probably going to spontaneously collapse into a mini-black hole. So that probably does count as a threshold of some sort.
Yes, some of the attempts at a theory of quantum gravity indeed are hinting at a "quantum of space" around the Planck order of magnitude.
However, theories that assume a "fixed grid" arrangement tend to run into issues. If space is quantized, the pattern is probably not a fixed rectangular lattice but something else. Perhaps the notion of space arises as a smooth approximation of some kind of graph of nodes and edges.
Source: Lee Smolin's books.
(So the search for the "quantum of space" is legit, it's just not likely to be fruitful very soon because of the scale involved. But someone's got to try it.)
My understanding of all this is that "space" and "time" do not exist. There are only quanta of energy (quantum field theory), from which we should be able to derive space and time. Which does not mean there is nothing smaller than the Planck length.
But I learned all this from watching the Discovery Channel, so I would appreciate it if somebody smarter could summarize the idea that space and time derive from something else.