Trouble Detected in Infamous Dark Matter Signal (quantamagazine.org)
98 points by dbasedweeb on April 12, 2018 | 22 comments



> Certain isotopes of argon radioactively decay more or less depending on the season.

This sentence puzzled me a bit, so I looked at the abstract of the referenced paper: https://arxiv.org/abs/1803.10110

A hypothesis is proposed to explain the long-standing DAMA/LIBRA puzzle. Introduced into the DAMA/LIBRA shielding is a purge gas of nominally high-purity nitrogen, which under this hypothesis contains argon impurities. Argon is introduced into the nitrogen purge gas either through leaks in the purge gas plumbing, or through commercially-supplied bottled nitrogen, diffuses through materials in the detector housings, and then comes in direct contact with the DAMA/LIBRA detectors. These argon impurities can then lead to a modulating 2.8 keV background under two scenarios.

Scenario 1): These impurities include the isotope 37Ar, which decays by electron capture, emitting a 2.8 keV x-ray. These decays appear as single-site, monoenergetic events in DAMA/LIBRA, and produce an annual modulation due to the variation of neutron flux in the atmosphere and at the Earth's surface, which in turn leads to a seasonal variation in 37Ar production from the reactions 40Ca(n,α)37Ar and 36Ar(n,γ)37Ar.

Scenario 2): Radon is also in the DAMA/LIBRA purge gas, modulating seasonally at a rate below the current DAMA/LIBRA limits. When radon or its short-lived daughters decay, the resulting beta, gamma, and bremsstrahlung radiation cause stable 40Ar to be ionized within the copper housings surrounding the NaI(Tl) detectors, resulting in characteristic 2.8 keV x-rays. Modulating backgrounds might also result from radon-induced neutron or gamma-ray flux from the surrounding cavern, leading to a small modulating background enhanced at low energy by the presence of 40Ar within the copper housings.

These two scenarios are straightforward to test through assay of the purge gas as well as Monte Carlo and laboratory study of the DAMA/LIBRA copper housings when excited by ionizing radiation.
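
To make the degeneracy concrete, here's a minimal sketch (plain Python, with illustrative numbers, not values from the paper): a seasonally produced 37Ar background and a WIMP "wind" signal with the same phase reduce to exactly the same cosine in an annual-modulation fit, so the fit alone can't tell them apart.

    import numpy as np

    # DAMA-style annual modulation: S(t) = S0 + Sm * cos(2*pi*(t - t0)/T).
    # Both a WIMP wind and a seasonally produced 37Ar background can take this
    # form; the numbers below are illustrative only.
    T = 365.25                       # period in days
    t = np.arange(0, 2 * T)          # two years of daily samples

    def modulation(S0, Sm, t0):
        return S0 + Sm * np.cos(2 * np.pi * (t - t0) / T)

    wimp_like = modulation(S0=1.0, Sm=0.02, t0=152)   # phase near June 2, as DAMA reports
    ar37_like = modulation(S0=1.0, Sm=0.02, t0=152)   # same shape if 37Ar production peaks then

    print(np.allclose(wimp_like, ar37_like))          # True: identical time dependence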


I had the same reaction when reading that. I'm frankly still confused. Isn't the difference in radioactive decay rate only in the atmosphere, where it can be affected by ionizing radiation? Not in a sealed device under a mountain?


In the experimental community doing direct searches for dark matter, the DAMA results have long been excluded and therefore ~discredited (see lots of papers from LUX/XENON/PANDAX/CDMS/etc).

But theoreticians have to keep beating a dead horse to keep themselves relevant (i.e. publish papers, get PR, and get funding). Kudos for the PR they managed out of this, but it's a yawn for the people actually hunting for DM.


Am I missing something, or does this seasonal variation imply that the "sea" of dark matter WIMPs is stationary in the galaxy (which would cause the motion of the sun around the galaxy to look like "wind")?

But actually, if there were WIMPs in the galaxy, wouldn't they orbit in precisely the same way as the Sun, leaving no relative motion? And thus no seasonal variation?

There would not even be daily motion, since it's assumed dark matter is cold, and thus the sun would capture the particles and they would orbit the sun in the same way the Earth does.


The disc of the Galaxy forms due to the self-interactions ("frictional forces") of ordinary matter. WIMPs shouldn't have those forces and therefore the dark matter halo is probably spherical. If we assume that the dark matter and ordinary matter started with the same angular momentum, then the rotational velocity of stars in the disc should be much faster than the WIMP halo.


The only "frictional" force of ordinary matter that matters on a galactic scale is gravity. And that would affect WIMPs equally.

> If we assume that the dark matter and ordinary matter started with the same angular momentum, then the rotational velocity of stars in the disc should be much faster than the WIMP halo.

I don't see how this can be the case. The orbital period of both would have to be identical in order to be in the same place.


Normal matter loses energy through self-interactions (collisions), which is radiated away. Because the emission is isotropic, angular momentum is conserved even though the total energy isn't. This tends to make the normal matter collapse into a disk, which must rotate faster than the original halo to conserve angular momentum. Dark matter, which hasn't dissipated any energy (gravity is conservative), can move around more, never decaying into the orbits that the normal matter does.
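
To make the "spins up as it collapses" step concrete, here's a toy calculation of my own (arbitrary units, not from the comment above): if the specific angular momentum L = v * r of a parcel of gas is conserved while dissipation shrinks its radius, its rotational speed has to rise.

    # Toy numbers in arbitrary units; only the scaling matters.
    r_halo, v_halo = 50.0, 100.0   # initial radius and rotational speed of the gas
    L = v_halo * r_halo            # specific angular momentum, conserved during collapse
    r_disc = 10.0                  # gas radiates energy and settles into a smaller disc
    v_disc = L / r_disc            # 500.0: much faster than the original halo speed
    print(v_disc)

The dark matter never goes through that dissipation step, so it keeps the original, slower-rotating halo configuration.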


> never decaying into the orbits that the normal matter does.

Exactly my point - it's either in the same orbit as the sun, in which case the orbital speed is the same, or it's not, in which case there's no wind interacting with the detector.

You can't have it both ways.


The dark matter is a gas, not a solid. It's not orbiting the sun, it's orbiting the galaxy (technically, not even the galaxy; it's orbiting inside its own halo, mostly). The orbits of the particles would be highly irregular, going in all directions, many with very eccentric orbits, etc. The sun is moving through the "rest frame" of the galaxy at a ~constant speed, but the earth is orbiting the sun, so it has a fractionally varying relative speed compared to the rest frame (speeding up and down every year relative to the rest frame).
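
A rough sketch of where the annual variation comes from (my own illustration with approximate textbook numbers, not from the comment above): the Sun moves through the roughly non-rotating halo at ~230 km/s, and the Earth's ~30 km/s orbital velocity, inclined ~60 degrees to that motion, alternately adds to and subtracts from it over the year.

    import numpy as np

    v_sun = 230.0                    # km/s, Sun's speed through the halo rest frame (approximate)
    v_earth = 30.0                   # km/s, Earth's orbital speed around the Sun
    incl = np.radians(60.0)          # inclination of Earth's orbit to the Sun's motion (approximate)

    t = np.linspace(0.0, 1.0, 365)   # one year
    # The component of Earth's velocity along the Sun's motion oscillates once per year.
    v_det = v_sun + v_earth * np.cos(incl) * np.cos(2 * np.pi * t)

    print(v_det.min(), v_det.max())  # roughly 215 and 245 km/s

That few-percent swing in the detector's speed through the halo is what shows up as the few-percent seasonal variation in the expected event rate.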


Is this a good summary:

Current dark matter (effect) detection experiments have yielded different results than expected.

These results don't seem to fit any single current theory about dark matter* (it would be nice if there were a short enumeration of theories and WHY they don't fit the combined set of observations).

Therefore we should spend more money, building more and different detectors in different areas run by different groups, and collectively gather more and varied data so that 'we' can attain a more clear understanding.

--

* However, aside from some changes in 'sensitivity' for existing detectors yielding "new data" (unexpected results), which seems like low-hanging fruit (confirmation of theory via modification of existing experiments), what's lacking from the article is a proposal of what, in theory, the new detectors might detect and /why/ they might be detecting that, or how different outcomes from the proposed experiments might provide support for or against current theories.


I think the bigger part of the story is that the criticism involves a plausible scenario for terrestrial contamination that requires no new physics at all. Needless to say that needs to be ruled out before we spend more time than necessary pontificating about dark matter.


Yes, I see the trouble. The team is "strongly arguing" that it has found the source of the signal while not sharing its raw data.

Evidence for dark matter is a big deal, so who wants to give away credit for that? Or, who wants to share inconsistent data when the results are literally earth-shattering? Visible matter interacting with some invisible forces only under certain conditions! Sensational!

Perhaps the Aether Hypothesis is true after all?


Personally, I think any experiment that publishes plots and not actual raw data should be defunded and decommissioned immediately.


The raw data is close to useless unless you are an expert on the detector which took the data.


I agree with this, though opening up to external scrutiny (to a reasonable extent, similar to, say, CERN) would help dispel a feeling of “yet another faster-than-light-type experiment”. I also don’t think that the “combativeness” with which criticisms of the experiment are addressed (cf. article) is in any way useful to the scientific endeavour.


So you think the LHC should "publish" 100 petabytes of data?

What you are stating isn't practical because of cost.

But I'll definitely go with the idea that "if you want to make a claim as big as finding dark matter and be believed, then releasing your data is probably a good idea".


http://opendata.cern.ch

> "Explore more than 1 petabyte of LHC data!"

And that's just the online free-for-all version. If you apply for access, you can analyse all of it through the LHC Computing Grid.


This is awesome. But it is still impractical. The researchers knee-deep in this already have to fight for the computing power and data access (and error checking of all of that) for years to get anything meaningful to come out. It's just not as simple as "publish the data". I'm willing to be proved wrong on this, but it's a lot like saying: here's an iOS IDE and a server, recreate WhatsApp. It's impractical for many reasons, but I'll definitely grant that with enough resources (money/time/brainpower/data access) it is not impossible.


It's worse! The data for the Higgs can't be published because... it was destroyed!

The aggregate, processed signals are all that is retained at the LHC; the raw data was gone before it could be analysed.

Also, they used a bloody hokey boosted classifier for the detection, but that's by the by now, apparently. And there were 12 events out of about 1 trillion, so all good there too...


That is not true, at least for CMS. All the RAW data taken in pp collisions during 2011 and 2012 - i.e. the Higgs boson discovery dataset - has been saved on tape. As a general rule, we never delete RAW collision data that actually makes it into permanent storage. Of course, data that didn't pass the real-time selection to be recorded in the first place is irrevocably lost.


So all the data apart from the data that was thrown away?


Also very true. Unfortunately this data is dirty, context-dependent, or plain missing. This is true of pretty much any "real" experiment.



