Over the past 10 years, the LHC has found more than 50 new hadrons (home.cern)
140 points by graderjs on May 19, 2021 | 55 comments



I wonder if this is like ancient astronomers finding more and more epicycles.


Nah, they're more like elements, but they're made of quarks not nucleons (protons/neutrons).

I think the modern equivalent of epicycles is string theory. Find an inconsistency? No problem, add a dimension.


> I think the modern equivalent of epicycles is string theory. Find an inconsistency? No problem, add a dimension.

A very common trick is to bump things to higher energy/mass. Your expensive collider didn't find the particle I predicted? The particle's mass must be larger than I thought; probably just beyond your collider's ability. You'd better build a bigger collider!

My theory of dark-matter/dark-energy/quantum-gravity/etc. doesn't fit known observations? This tweaked version only has discrepancies above a certain energy, which coincidentally is just beyond current colliders.

This is arguably how we found the W, Z and Higgs bosons; the charm, top and bottom quarks; and the tau. On the other hand, supersymmetry has been playing this game for decades, claiming that each new collider was sure to find (or rule out) the plethora of particles it predicted. Each time, the mass estimates were revised upwards.


The physicists knew all along what the potential mass range was. They hope to find it at the lower end of that range, because that's cheaper, as well as feasible with existing engineering capability.

Arguably, the biggest problem with the Higgs boson is that it was found exactly where they were expecting it. They'd ruled out other ranges; anything bigger would no longer be a Higgs boson. So all it did was tell them what they already knew. Failing to find it would actually have been more interesting -- though finding something different would have been even more interesting.

So they don't really say "we're sure to find it". They're hoping to find it -- and hoping not to.

They'll always need a bigger collider. That's where new results will always be. They're hoping for results they can't predict, and that makes it really hard to write a justification for the expense.

(I personally would just as soon spend much less on Big Science like that, but you'll find a ton of people willing to defend it. Fundamental physics has a history of producing results that are unexpected but economically immense, from lasers to the Web, as well as driving human intellectual curiosity.)


> They'll always need a bigger collider. That's where new results will always be.

Bigger colliders are a low-risk approach: we'll definitely learn something, even if it's just better constraints on our models. This was a no-brainer back when colliders took up rooms or buildings. We've gone so far down this path that colliders are now measured in kilometres (and straddle national borders!), making it harder to justify the expense.

Higher-risk approaches can be much cheaper, by looking for new paths (where we haven't hit diminishing returns yet). EmDrive is an example; wakefield accelerators are another. These have a greater chance of going nowhere (again, EmDrive), but cost so little that we can fund a whole bunch for the equivalent of a bigger collider.

Of course, the best approach is to strike a balance between these two extremes: don't spend everything on the biggest possible collider; but likewise don't avoid projects just for being expensive.

Another good example is ITER versus the various fusion startups.


That's not quite true. The dimension of the ambient space is constrained by consistency considerations to be exactly 10 in string theory (11 in M-theory).

The important question is what the geometry of the extra dimensions is, if they exist. You need a large amount of energy to probe the extra dimensions, so to a low-energy observer they would appear just like some extra particles in the theory. (Of course, the observer has to have enough energy to find even those particles.)

The Standard Model also has seemingly arbitrary particle content and interactions, without much to constrain it. A majority of the particle content was put into the theory by hand after seeing a failure of the theory. In string theory, this choice has been moved from the theory to the initial data (the choice of geometry of the extra dimensions).


I'm not a theoretical physicist and only took a few math courses beyond those required by my major, but it was my understanding that the mainstream string theories have 10, 11, or 26 dimensions. I also recall, and could be wrong, that there can be infinitely many string theories, with 4 dimensions as the lower bound and some unspecified upper bound.

But I'm not an expert and I was mostly being hyperbolic with my original reply. Corrections are always welcome.


26 is for theories with bosons only. If you want fermions you need 10; 11 in M-theory as GP suggested.


Worth noting that those ancient astronomers weren't wrong, so to speak: each additional epicycle predicted the data better. What they were doing wasn't fraud the way that, say, dowsing would be. Rather, they hadn't made the mental and cultural leap to the much simpler model that made epicycles (almost always) obsolete.

I think it's fair to say that physicists are well aware that this could be like finding more and more epicycles, in the sense that they're hoping to one day find a much simpler model to explain all this stuff than a big list of particles with no simpler theory behind it.


Not especially, no.

This is more akin to finding new molecules (hadrons), given known kinds of atoms (quarks).



Always nice to hear some Kirsty Hawkshaw, her two Opus III albums are some of my favorite music to program to.


Fifteen hundred and seven.

<3


Biggest crash in history.


Front page New York Times, August 10th, 1988... I thought you was black, man!


Could it be that string theory is correct, insofar as the universe is written in spaghetti code?


Seeing such a number (and knowing nil about particle physics), I imagine someone cataloguing fallen leaves in autumn using low-res pictures. They are the same kind of leaf, but from different angles it looks like there are several kinds.


Hadrons are composite particles, so that's exactly what this is: different quark combinations. Back in the day people didn't know this and were quite unhappy with such a zoo of particles.


This might be an extremely layman-level question, but have any discoveries from the LHC trickled down into consumer/commercial/industrial-level technologies so far?


The information sharing tools they developed back at the start got some popularity.

I imagine you're wondering about newer things and actual discoveries. I just couldn't resist the setup.


Lots of the designs used by CERN are basically open-source hardware, some more useful than others depending on how niche they are. Heck, CERN helps with KiCad's development, which has benefited numerous hobbyists.

For example, here is a voltmeter someone built using designs from CERN: https://m.youtube.com/watch?v=D28uSzCs7-k


That was an extremely satisfying video to watch. Thanks for that!


As others pointed out, the work done to build and support the actual machine has led to a lot of positive contributions back to society.

However, you asked for discoveries, so I imagine you were thinking of the stuff the LHC has been searching for.

And I think in that regard it's safe to say no. From what I understand, there's not a huge expectation for that to happen. The reason is that the LHC is a high-energy particle physics search, while almost everything we make or use is made of condensed matter[1] and low-energy particles, an area where we know pretty well how individual particles behave.

In the condensed matter field though, where many-body particle interactions dominate, there's a lot of interesting work that might have applications. As an example, a relatively recent discovery is superconductivity in twisted graphene layers[2].

I still think the LHC and similar machines should be built. After all, a lot of good comes out of the technology required to build and operate them, and there is a chance we'll learn something with significant impact.

[1]: https://en.wikipedia.org/wiki/Condensed_matter_physics

[2]: https://www.quantamagazine.org/how-twisted-graphene-became-t...


How much of what they study occurs naturally versus only through man-made experimentation? Which is to say, do the collisions they cause also occur without any man-made instrumentation?


From what I've gathered, cosmic rays hitting[1] the atmosphere produce the same or similar collisions that the LHC and similar accelerators do.

These collisions can be detected, and properties measured[2].

The main difference is that with accelerators like the LHC you control exactly where and when collisions happen, making precise measurements of a vast number of events possible. If I did my math right, the Higgs required over 10^15 collisions before detection was claimed (a rough sketch of that estimate is below the links).

[1]: https://en.wikipedia.org/wiki/Air_shower_%28physics%29

[2]: https://en.wikipedia.org/wiki/Cosmic_ray#Detection_methods
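
For context, here's a minimal back-of-the-envelope sketch of where a number like 10^15 comes from. The inputs are rough assumptions on my part, not official figures: roughly 10 inverse femtobarns of integrated luminosity per experiment around the time of the discovery announcement, and an inelastic proton-proton cross-section of roughly 70 millibarns.

    # Back-of-the-envelope estimate of inelastic pp collisions delivered
    # by the time of the Higgs discovery. Both inputs are rough, assumed
    # ballpark values, not official numbers.
    INT_LUMINOSITY_FB_INV = 10   # integrated luminosity, inverse femtobarns
    SIGMA_INELASTIC_MB = 70      # inelastic pp cross-section, millibarns

    # Unit conversions: 1 fb^-1 = 1e39 cm^-2, 1 mb = 1e-27 cm^2
    lumi_cm2_inv = INT_LUMINOSITY_FB_INV * 1e39
    sigma_cm2 = SIGMA_INELASTIC_MB * 1e-27

    n_collisions = lumi_cm2_inv * sigma_cm2
    print(f"~{n_collisions:.0e} collisions")   # ~7e+14, on the order of 10^15

With those assumed inputs the product comes out around 7x10^14 per experiment; counting both general-purpose detectors, or the full 2011-2012 dataset, pushes it past 10^15.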


This is reminiscent of Seaborg's work in the 1960s that led to the discovery of lots of new heavy elements. Everyone was pretty sure they could be created, but someone had to build an accelerator to do it.


There's a hypothesis that we'll eventually find more stable superheavy elements as the number of neutrons increases in the nucleus [0]. Perhaps we'll find something like that in hadrons (not counting protons and neutrons of course)?

[0]: https://en.wikipedia.org/wiki/Island_of_stability


Here's an ignorant question, but my understanding is that every element has an emission/absorption spectrum. That being the case, why not point a telescope at some supernova remnant and look for any lines that do not match the known elements? If stable super-heavy elements exist, then they would probably be made in supernovae and the remnant would show signs of those elements, no?


I think the main issue is that they would need to be made in sufficient quantity to show up in the spectra of those supernovae and collisions. The heavier they are, the fewer will be made, making detection harder and harder. You'd also need to calculate all the lines for a given element at the various energy levels to determine whether a set of lines matches, and that gets nontrivial pretty quickly.
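
To illustrate the "calculate all the lines" point, here's a minimal sketch using the crude, non-relativistic Moseley-law approximation for K-alpha X-ray lines. Element 115 is only a hypothetical example here, and real superheavy atoms need large relativistic and QED corrections, so treat this purely as an order-of-magnitude illustration.

    # Crude Moseley-law estimate of K-alpha line energies.
    # Non-relativistic, so only an order-of-magnitude guide; the
    # corrections grow large at high atomic numbers.
    RYDBERG_EV = 13.6  # hydrogen ground-state binding energy, eV

    def k_alpha_energy_ev(z: int) -> float:
        """Moseley's law: E ~ Ry * (Z - 1)^2 * (1 - 1/2^2)."""
        return RYDBERG_EV * (z - 1) ** 2 * (1 - 1 / 4)

    for z in (26, 92, 115):  # iron, uranium, and a hypothetical superheavy
        print(f"Z={z}: K-alpha ~ {k_alpha_energy_ev(z) / 1e3:.0f} keV")

Even this toy version puts the characteristic lines of a Z=115 element above 100 keV, and the approximation itself breaks down at such high Z, which is part of why matching candidate lines gets nontrivial so quickly.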


I think that issue is secondary to the fact that the island of stability has predicted half-lives in the range of a year or two. So in order to look for that spectrum, we'd need supernovae that happened very recently and close enough for us to make out the spectrum in what remains.


I hadn't seen anything saying a year or two; I think I've only seen predictions on the order of hours to days, which would basically make detection impossible in anything reasonably close to us, since they'd be around for such a short time and only during/after the brightest part of the whole thing.


Ah, that makes sense. Thank you.


Not answering your question, but fun fact: helium was discovered by seeing an unexplained emission line in our own sun’s spectrum, which is where it got its name.


The most common theory is that these superheavy elements would be more stable, but nevertheless unstable in absolute terms. There are predictions that the half-lives would be on the order of minutes or days. However, all of this is obviously highly speculative.


Looking forward to the stable isotope of element 115 that serendipitously falls into that island of stability.


Why element 115 in particular?


They are probably making a joke related to UFOs. There's a guy named Bob Lazar, part of alien/UFO lore, who claimed he worked at Area 51 and that the alien craft they had there used a stable isotope of element 115 for power or warp drive (or something like that).


My dyslexia had me thinking this was a sexual thing...


Am I the only one who misread the word "Hadrons"?


The FCC [0] is looking at 100 TeV vs the current LHC's 14 TeV. Is it a big enough improvement to find anything new? I feel like CERN (LHC) is becoming like NASA, funding massive jobs projects (the Space Shuttle) that can't compete with a motivated private enterprise or other disruptor (SpaceX) that works out what is needed to make real progress (in SpaceX's case, real reusability and methalox fuel).

[0] http://fcc-cdr.web.cern.ch/


In defense of government space agencies, NASA operates under a wholly different regulatory framework [0].

They can't do as they please on hiring, subcontracting, or suppliers, because that money has oversight from the Treasury and Congress. It's taxpayer money. It's like working with one hand tied behind your back. Not to mention that if a political mandate comes through saying to stop all work on the Space Shuttle, then no matter what progress you've made, what new materials are coming out, or whether computer simulations are an order of magnitude better, you have to stop.

SpaceX is merely a launch alternative. The shuttle program has already paid its dues and paved the road.

[0](https://www.nasa.gov/emd/policy-regulations-and-guidance)


Hooray. How about figuring out a way to raise the hadrons-discovered-per-dollar ratio, and maybe I'll be more excited.


> ... per dollar ratio ...

LHC - Large Hadron Collider - Geneva, Switzerland - Most likely CHF/EUR. Not sure why you would care how much is being spent on it.


>Not sure why you would care how much is being spent on it.

It's the easiest way to see how "big" something is.


Hm, you could also just look at the Wikipedia page, it has all the measurements you're looking for most likely: https://en.wikipedia.org/wiki/Large_Hadron_Collider


km, kg (or if you prefer MeV/c^2) work just fine.


by "big" I didn't only mean physical size, the amount of effort needed / similar too

e.g quant computer


If it was the military or police, would you suddenly care about what their budget was?


My point was that the comment I replied to is using USD, while the LHC most likely does its budget in EUR or even francs. A clear indication that the author of said comment probably lives in the US, which I don't think is funding the LHC.

But to answer your question, I absolutely care about the budgets for the military or the police, or any entity really, as long as it belongs to my country.

What the US spends its money on I couldn't care less about, although it sucks that the government there is so reluctant to actually pay for services for the humans living there and instead spends the money on the military. But in the end, it's not my business, so I don't really care either way.


Why? Are you waiting for hadrons to be affordable for consumers or something?


better to spend that money on, what, plastic widgets and more manipulative ads?


Actually, the LHC is a sub-$10B endeavor, and it is almost pocket money in the world of manipulative ads. Google's profit (not revenue) is $17B/quarter, i.e. Google could build 2 LHCs per quarter.


It's been going that way since Gutenberg.

My pet theory is that democracy is a byproduct of soap advertisement.


LHC spend probably qualifies for R&D tax incentives, which might end up getting them more than 2 per quarter.


I think Google's ad R&D also qualifies for those. It's part of the reason Amazon historically pays no taxes (the other one being that Amazon.com is barely profitable.)



