https://www.nature.com/articles/nature25792 is the more interesting article, I think, as it describes the experimental measurement that forms the basis for the claim about early-universe gas temperature.
Being the skeptic I am, I suspect that it's more likely that there is some systematic error that is being underestimated or overlooked rather than new physics. Fortunately the experiment is simple enough that it can be reproduced elsewhere relatively easily.
Interestingly, the region around 75 MHz is between two ITU broadcasting allocations. They claim to rule out the FM band as the source of the higher spectral edge through several possible channels, but I wouldn't be surprised if there were some way to get excess power into those bands that they hadn't considered (although I guess any strange propagation mechanism would probably be time-dependent).
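The placement of the feature follows directly from the redshifted 21 cm line; a quick sanity check (the rest frequency is exact, the redshift is the approximate center of the reported absorption profile):

    # Redshifted 21 cm line of neutral hydrogen.
    nu_rest = 1420.405751  # MHz, hyperfine transition rest frequency
    z = 17.2               # approximate redshift of the absorption center
    print(nu_rest / (1 + z))  # ~78 MHz, just below the 87.5-108 MHz FM band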
Last I heard, reionization was not understood in detail, so I find it surprising the author here is confident they found a dark matter signature. Not being able to read more than the abstract I can't say for sure, but it also looks like this is not an experimental paper but a theoretician trying to put out a quick paper: no mention of the instrument in the abstract, nor any comparison to existing exclusion limits on dark matter today.
There are two simultaneous papers in Nature (which I linked to), one describing the experimental result from EDGES, the other an explanation for it.
I am wondering, can't there be something like gravitational cooling? When playing around with gravity simulators, one observes that sometimes high-velocity parts leave the bulk, which in my understanding leads to a cooling of the pack (the escaping part carries away kinetic energy, after all). If I let some "cool" matter interact with "hot" matter only gravitationally, I'd expect that this leads to an equal "temperature" eventually. Is it that, due to the very weak interaction, the time scales are too long for this?
From studies of the cosmic microwave background we can be pretty confident that temperatures were very uniform at the time of recombination, and that the cooling of the CMB (which was fixed at that time [1]) closely resembles the adiabatic expansion of an ideal gas. So we're already seeing cooling-by-separation. If we have a decent idea of the history of the expansion, then we also have a decent idea of the temperature of the CMB at the surface of last scattering. Since we look so intently for tiny overdensities or underdensities to seed structure formation (the collapse of matter into early galaxies), it's hard to imagine where one could hide an enormous underdensity or overdensity that could cool the CMB across an enormous region (we see the same temperature along lines of sight with wide angular separations; this is the horizon problem [2]) without leaving fingerprints.
The period between recombination and "first light" from the first stars (triggering reionization) is not especially well understood, but the expectation is that the expansion of the universe would be pretty smooth in that epoch. So the neutral hydrogen around the time of first light being cold enough to produce the unexpectedly strong 21cm absorption line is an oddity.
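To put rough numbers on why it's an oddity (a back-of-the-envelope sketch using standard textbook redshifts, not values from the paper): the CMB cools as (1+z), while after the gas thermally decouples from the radiation (around z ~ 150) it cools faster, adiabatically as (1+z)^2:

    # Back-of-the-envelope temperatures at the EDGES epoch.
    # Standard textbook values, not taken from the paper itself.
    T0 = 2.725       # CMB temperature today, kelvin
    z_dec = 150      # rough redshift of gas/CMB thermal decoupling
    z_edges = 17     # rough redshift of the EDGES absorption feature

    def t_cmb(z):
        """CMB temperature scales as (1 + z)."""
        return T0 * (1 + z)

    def t_gas(z):
        """After decoupling the gas cools adiabatically, as (1 + z)**2."""
        return t_cmb(z_dec) * ((1 + z) / (1 + z_dec)) ** 2

    print(t_cmb(z_edges))  # ~49 K
    print(t_gas(z_edges))  # ~5.8 K with this sharp cutoff; detailed
                           # calculations give ~7 K, as decoupling is gradual

The reported absorption is roughly twice as deep as even that expectation allows, which is what pushes people toward colder gas or a hotter radio background.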
If the evidence holds up suggesting that this gas is uniformly colder than expected, then we could take the position that our idea of the history of the expansion is somehow wrong, or alternatively that some other process was at work at the end of the dark ages. We have constraints on both; in particular, "some other process" is tricky because the almost-exactly-black-body spectrum of the CMB puts constraints on early cooling-by-clumping, where the cooling is driven by "chemistry", i.e. (in this case exothermic) reactions between particles of different species.
There could be "chemistry" in the dark matter sector that allows dark matter to cool by extremely local clumping (for example, if there are two oppositely-charged dark matter particles that, at some dark-sector recombination, combine into one neutral composite dark matter particle, analogously to how electrons and protons recombined into neutral hydrogen), but then you have to figure out how to have the dark matter be colder than the hydrogen and how to thermalize the hydrogen to the dark matter. You also have to avoid smearing out small matter density fluctuations.
If structure formation works the way we think, everything at recombination seems too uniform and diffuse for local gravitational interactions to do the job of transferring much momentum from the matter sector to the dark matter sector in time for the 21 cm absorption line seen by EDGES. There are also constraints on extra long-range interactions, so the idea Carroll raises of a zero-mass boson mediating the interaction is hard to square with those (a massive mediator would be shorter-range, in general, and could carry more momentum between matter and dark matter, but there are constraints on such interactions from the much later universe -- for instance, we would expect missing momentum in particle physics experiments, along the lines of the missing momentum that led to the discovery of the neutrino).
The problem is that there are lots of strongly correlated observations and pushing at one tends to cause problems with one or more of the others. But that's what theorists in this area (model building) seem to like to do. :-)
Thanks for the elaborate answer. As far as I understand, your fifth paragraph refers directly to my question. How would uniformity rule out purely gravitational interaction? As far as I can see, gravitational forces should be enough to bring two reservoirs of different temperature to equal temperature; it's just the time scale that may be a problem. And if I start with two (almost) uniformly distributed reservoirs, long-range interactions shouldn't make a difference. Inflation should have flattened everything out already anyway.
I don't understand your idea. Gravitation squashes collisional matter together, heating it. The hot matter radiates, allowing it to fall deeper into dense structures. I don't see how this cools non-infallen hydrogen, rather than lighting it up.
I'll try to elaborate: let's say we have two reservoirs of baryonic matter, one cool and one hot. If these interact through radiation (electromagnetic forces), both reservoirs eventually get the same Boltzmann distribution and thus temperature. Now take the same situation, but switch one reservoir (the cooler one) to dark matter, so that the force between the reservoirs is purely gravitational. Aside from the time scale, would that be any different? In a large collection of masses, particles can exchange momentum and energy through gravitation, as far as I recall. Therefore one could expect the two reservoirs to equilibrate. I think what may differ is that gravitation is only attractive, so contraction may be faster than equilibration. But is this the case if the initial temperature of the constituents is high enough, especially of the baryonic stuff?
Oh sorry I missed the "switch one reservoir to dark matter".
In that case, if the DM and matter can only interact gravitationally (and DM-DM interactions are also gravitational), then you have a dance wherein the collisional normal matter sheds momentum via radiation, letting the normal matter fall into the DM eventually. The momentum exchanged between the DM and the normal matter will be pretty small.
(This is similar to what people pursuing the cosmic web hypothesis try to model, essentially, and is what's conjectured to be happening during the dark ages).
ETA: the key thing here is that the collisionless cold dark matter will stay cold, while the normal matter will get much hotter (and light up normal matter further away still). Isn't this the nub of the EDGES result?
I suppose you could attack your question by considering a DM galaxy and a baryonic galaxy binary. On any given spacelike slice you can probably represent the two as a barbell with the barycentre on the infinitesimally thin bar, although there are lots of 3+1 formalism gotchas that would have to be dealt with.
If you arrange the matter galaxy so that its internal structure is sufficiently stable then the whole binary system would lose momentum-energy to gravitational radiation. It won't be much, and that's a big "if" (and the "if" also works against trying to see the matter galaxy cool, especially as the energy loss would tend to contract the "bar").
You might enjoy this crazy 1998 paper, which I just found; it adds a bit of heft to the "not much" guess above.
It seeks to examine the gravitational waves shed by a two-galaxy cluster.
"pessimistic point of view justified by the combination of great numerical difficulties (three-dimensional calculations) together with very disappointing small values expected for the gravitational-wave luminosity, the amplitude, etc"
"In brief, our simulated cluster produces changes in the relative distance of the order of 10-22—detectable with current technology—in a short period of 4 days."
(LIGO 2015 saw about the same relative length change (h), but at a frequency of 35-250 Hz. I'm not sure how part of a gravitational wave at ~10^-17 Hz could be caught by a LIGO-like detector, since only the higher derivatives of h are useful for identifying gravitational radiation.)
(but of course for cooling we're more interested in the luminosity, which is small compared to that of an ordinary galaxy, but a bit more interesting compared to the blackbody radiation of a galactic-mass diffuse cloud of atomic hydrogen, although that's only considering the normal-matter member of the binary, rather than the total energy leaving the region of spacetime the binary is in.)
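For a sense of scale, here's a rough quadrupole-formula estimate with made-up but plausible numbers (two 10^12-solar-mass point masses in a circular orbit at 1 Mpc separation; a cartoon of mine, not the 1998 paper's simulation):

    # Quadrupole-formula GW luminosity for a circular binary:
    #   P = (32/5) * (G**4 / c**5) * (m1*m2)**2 * (m1 + m2) / r**5
    # Illustrative numbers only.
    G = 6.674e-11      # m^3 kg^-1 s^-2
    c = 2.998e8        # m/s
    M_sun = 1.989e30   # kg
    Mpc = 3.086e22     # m

    m1 = m2 = 1e12 * M_sun  # two galaxy-mass point particles
    r = 1.0 * Mpc           # orbital separation

    P = (32 / 5) * G**4 / c**5 * (m1 * m2)**2 * (m1 + m2) / r**5
    print(P)  # ~1e17 W, versus ~1e37 W of starlight from a typical galaxy

Even with generous numbers, the gravitational-wave channel sits some twenty orders of magnitude below the electromagnetic one, which is why it can't plausibly cool or thermalize anything on these scales.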
What would source the gravitational radiation that would thermalize the two reservoirs? The metric sourced by a uniform cloud of hot gas doesn't really lend itself to periodic change (under a slicing) unless it's collapsing far from spherically symmetrically, and even then the momentum-energy carried out by electromagnetic radiation would dwarf the gravitational radiation.
Also, if you let the hot and cold clouds fall into each other, then electromagnetic scattering will do the bulk of the work of thermalization (or, more likely, the heating of both clouds of inelastically colliding matter; cf. collisionless dark matter clouds sliding through one another).
I get the sense that physicists are a different breed of thinker... it's like normal
people are like "nice, I have a car that can get me to work and back..." and physicists are like "imagine an atom was a car and you wanted to go to the edge of the universe..." or like "nice, a carrot grew in my garden!" and they are like "how could you get a carrot to grow to the size of the earth?"...
I see a low-res 2D picture, and apparently they see cold particles that are around 3x the mass of protons, and the structure of the universe... kinda nuts.
It's not crazy creativity, it's thinking of things in terms of math. When you start thinking about situations in mathematical terms, you can start playing with variables. When you translate that back into layman's terms, it seems non-intuitive. But there's nothing mystical about it; quite the opposite.
> It's not crazy creativity, it's thinking of things in terms of math.
Thinking of things in terms of math in order to solve problems is intrinsically creative :). I'd agree there is nothing mystical about creativity or mathematics, though.
No, here people don't think "how could you get a carrot to grow to the size of the earth", but instead, "here's a carrot, how could you use CSS to turn it into a recurring revenue stream" :)
When I went to Uni a common pairing was "physics and philosophy". That was my chosen degree subject when I was in high school. I wanted to do particle physics.
In the end I loved (pure) maths more than philosophy, so I graduated with Theoretical Physics and Mathematics. Probably a quarter of the physics students did the Philosophy of Science module (the only things I remember covering were aethers, Zeno's Paradox, and maybe the realism of QM interpretations [it was a couple of decades ago]).
One could just as easily say that mathematicians are natural philosophers clouded by rigid symbols that have no inherent meaning on their own. For every zag, a zig.
Alright, I’ll bite. I challenge your assertion that math is without objective meaning.
First I want to make an observation. Do you know the history of how the study of complex algebra/calculus came about? If so, I assume you will agree that it was initially a completely abstract thought experiment with no connection to anything in the “real” world.
Given your assertion that math is without meaning, it would seem to me that mathematical ideas that originate purely out of human imagination would just be arbitrary semantics.
Then how can it be that complex analysis, only several decades after it was formulated, turned out to be not only useful but necessary to formulate the theory of quantum mechanics in a way that agrees with physical observation?
I can name numerous other examples of the same phenomenon; namely, that a purely abstract mathematical idea is shown, long after its formulation, to be profoundly reflected in physical reality.
To me it seems obvious that the way these phenomena occur implies that the process by which humans use their imagination and reasoning to come up with abstract mathematical ideas is more akin to using intuition to map out objective 'structures' of logic (structures that are also reflected in the underlying makeup of physical reality) than to simply playing with semantics, as you seem to be asserting.
Tl;dr
Post-modernist philosophers are fools and they should be ashamed. QED.
I agree with you that there are many incredible and useful insights based on mathematics and the equals sign, but the main point of contention is that not every mathematical truth has a corresponding physical phenomenon. Nor can we adequately explain how an equals sign works, or why it works. Mathematics and [im]material reality are not one-to-one, and assuming that mathematics supersedes the imagination, or is a superset of human language and expression, negates human experience and renders our lives secondary to "almighty math." Mathematics is a tool, would you agree? Philosophy is also a tool. Mathematics without a human user is like a video game without a player. I am not asserting that mathematics has no "objective meaning"; rather, it _only_ has objective meaning (because for every "object" we must have a "subject", namely the observing consciousness).
I'm always up for a discussion on this topic, because I find it very interesting, and I think there is a major (dare I say it) metaphysical point here that is overlooked by mainstream intellectuals. However, I've argued with enough post-modernists online over the years (which is usually like arguing with a wall) that I might have become a bit snarky; sorry about that. I appreciate the graceful tone in your response.
I agree that the symbolic language we call mathematics and reality are not one-to-one, as you say. But the fact that we can use abstract reasoning about these symbols to uncover new ways of understanding the physical world, especially in cases like the one I laid out in my example, implies to me that there must be some objective reality that is in some way captured by these symbols, in a way that plain philosophy cannot match.
So to answer your question: I agree that math is a tool, but I think in some sense it is also more than a tool. I believe it can also be seen as a map of a platonic reality, and that there is some element of our mind that is able to observe this realm, which allows us to draw the map (using mathematical symbolic language) and come to agreement about how it should be drawn. And elements of this platonic realm are, for some mysterious reason, also reflected in the structure of our physical reality.
Splendid, I really appreciate your taking the time to formulate a response -- it's very interesting to consider a metaphysical or Platonic realm where maths, although it may not exist in isolation, ends up arriving at the same points and valleys and landscapes and landforms time and time again. That is actually quite peculiar, the regularity with which mathematics works. I spent some time at the end of my university studies [the first go-round] trying to understand how we as humans came to discover multiplication and division. There are many possible operations we can do on numbers but only some yield a useful symmetry whereas others result in a jumbled chaos.
A close friend of mine refers to humans as "symbol makers", and I hold firmly that everything in the flow of life is meaningful, but it's really astounding that we can filter out useful patterns from our surroundings. Your point evokes for me, in a similar way, the beauty of a leaf or a tree: it has been speculated that over many millennia our sensory systems (namely sight) have homed in on being able to find tasty ripe fruits and berries (why they may appear bright red or bright purple when in full ripeness and before ruin... to pilfer an Alt-J lyric).
In that way, perhaps maths is some sort of tree or leaf or forest that is naturally existent, not actually separate from the earth or the forest or the consciousness of man, but still somehow a useful set of patterns our [mind] intellect-sense has been able to pick out and find the tasty and juicy bits of.
One very fascinating part of the whole narrative of mathematics is progress. For example, Kepler's calculations based on Tycho Brahe's observations of the planets, mathematicians dedicating their lives to figuring out n-many decimal places of logarithms and creating reference books, and also equations and derivations. Although maths may somehow "exist" naturally, because a set of equations or inferences or physical phenomena may have a mathematical representation, they still need to be discovered (and often re-discovered) to stick around and be of any use to us. To me it still echoes the personal mission of understanding and critical thinking: one must come to the solution on one's own and verify it in one's personal experience to truly feel it and know it to be truthful.
Would you categorize maths as more of an invention or as a discovery? Pure discovery would imply that maths exists on its own like a tree does (or "might" if we consider that a perceiving consciousness must also be part of the 'tree'). Whereas, an invention is something deliberately put together to solve a functional need in the life of man.
Nietzsche solved philosophy; everything since has been quibbling over semantics, there is no original thought anymore. Whereas physics’ Nietzsche was Newton, and there have been continuous advances since his time.
Maybe in the sense that von Braun solved rocketry. I agree that everything that has come since has been refinements to Nietzsche's developments (dare I say models), but those refinements have brought about the real-world application of philosophy more than Nietzsche ever did.
See Mao, or Popper, or, even though I can't stand the guy, Chomsky.
Nietzsche and Wittgenstein worked to show that many (most) of the then-current philosophical problems weren't really problems at all. Bias and language were such a big part of the problems that, when you subtract them out, the problems are rendered hollow. So the subject as it stood until the 19th century could very well be called a problem, and yes, it was solved.
Sure, if you totally reframe the entirety of what it means "to solve philosophy" to mean what you said, then gaius's comment makes sense.
I'm not sure I agree that solving a lot of the then/now "problems" with philosophy is the same as solving the entire field of philosophy. Especially since it requires mobile goalposts.
Also, "X solved philosophy" is a hilarious and absurd claim, because that's like saying "Tony Hawk solved the skateboard." Perhaps Tony Hawk is great at skateboarding, but you can never personally know what it's like until you try it yourself. Philosophy is not something that can be solved by someone outside of yourself; not even Nietzsche can wipe away your ignorance, only you can do that for yourself.
No. I suppose when you say "nothing" you're talking about philosophical propositions, which can be mapped to computer programs. In which case we can note that (1) the vast majority of programs have never been generated before by any process and (2) the vast majority of possible programs cannot be reduced to simpler programs that might have been expressed earlier.
Who said anything about computers? The whole universe is in a constant state of Recycling. Remember that old adage "energy/matter cannot be created nor destroyed?"
I recall someone (Eliezer Yudkowsky?) writing an essay with the thesis that a couple of millennia of epistemology did not foresee the epistemic issues raised by quantum mechanics.
I get that this is in Nature, but isn't fitting the data with some phenomenological model what every (?:astro)?particle physicist does? As an outsider, I can't tell whether this is a significant phenomenological explanation or not.
There are multiple competing models of the early universe. This new observation increases our confidence in the models that are consistent with it, and decreases our confidence in the models that are at odds with it.
In particular, this new observation constrains the nature of dark matter (lower particle mass than expected and "highly non-relativistic").
Ok, so my understanding is that the term "dark matter" refers to a difference between the theoretical and observed expansion speeds (accelerations?) of the universe. Can someone summarize for me why this is a useful concept, rather than just simplifying it to "we have the theory of universe expansion wrong"? What is the evidence for there actually being something like matter involved?
You are thinking of dark energy, which is a completely different beast (also a lot more vague).
Dark energy is the "why is the universe expanding so fast"-stuff and dark matter is the "why are galaxies so heavy"-stuff of the universe.
Galaxies appear to be heavier than what you get by adding up all the visible mass (you can tell by how much a galaxy bends light coming from behind it), so it makes sense to talk about matter. It's dark because it doesn't seem to interact electromagnetically with ordinary matter.
Looking at just all the gravity (dark or not) in the universe you'd expect all the matter in the universe to slowly start clumping together, or at least slow down the expansion of the universe, but we find that things are flying apart, and increasingly quickly so. So there's some force at play, and that's dark energy.
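To make the "galaxies are too heavy" point concrete, here's a minimal sketch with typical Milky-Way-like numbers (illustrative, not a measurement): a roughly flat rotation curve of ~220 km/s out to 50 kpc implies far more enclosed mass than the visible stars and gas supply.

    # Enclosed mass implied by a circular orbit: M(r) = v**2 * r / G.
    # Illustrative Milky-Way-like numbers.
    G = 6.674e-11     # m^3 kg^-1 s^-2
    M_sun = 1.989e30  # kg
    kpc = 3.086e19    # m

    v = 220e3         # rotation speed (m/s), roughly flat to large radii
    r = 50 * kpc      # radius well beyond most of the visible disk

    M = v**2 * r / G
    print(M / M_sun)  # ~5.6e11 solar masses enclosed; visible stars and
                      # gas account for only ~1e11, hence "dark" matter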
Yes, it's just a hint. This [1] says the confidence level is at 3.8 sigma (5 sigma is the 'gold standard' for this kind of physics). Independent verification of the findings should settle the matter in the future.
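For reference, converting those significance levels into one-sided Gaussian tail probabilities (a quick sketch of mine, using "sigma" in the usual Gaussian sense):

    # One-sided Gaussian tail probability for a given significance.
    from scipy.stats import norm

    for sigma in (3.8, 5.0):
        p = norm.sf(sigma)  # survival function: P(X > sigma), standard normal
        print(sigma, p)
    # 3.8 sigma -> p ~ 7.2e-5 (about 1 in 14,000)
    # 5.0 sigma -> p ~ 2.9e-7 (about 1 in 3.5 million)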
There is evidence of the strongly interacting, superfluid dark matter every time a double slit experiment is performed, as it is the medium that waves.
There is ample evidence that dark matter exists, as in, there is something there that we can indirectly observe. We have fairly good ideas about what it isn't, we just don't know what it is for certain.
Dark matter (LCDM) solves some problems and creates others, just like modified gravity theories solve some problems and create others. Hopefully more experimental data will lead to an answer.