I don't understand the particle physics they're talking about, but I do find it fascinating that so much of the work is really about sorting through a massive amount of data to remove the noise and find the signal. It sounds like they're using machine learning algorithms to examine and classify the various interactions in the data.
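For anyone curious what that looks like in practice, here's a toy sketch (features and distributions invented by me, nothing like a real analysis pipeline) of training a classifier to separate "signal" events from background noise:

    # Toy sketch: separate "signal" events from background with a
    # boosted-decision-tree classifier. Features and distributions
    # are invented for illustration only.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5000
    # Background: broad, featureless distributions.
    bkg = np.column_stack([rng.exponential(50.0, n),   # a "mass"-like feature
                           rng.exponential(20.0, n)])  # a "momentum"-like feature
    # Signal: a narrow peak sitting on top of the background.
    sig = np.column_stack([rng.normal(125.0, 2.0, n),
                           rng.exponential(40.0, n)])

    X = np.vstack([bkg, sig])
    y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = background, 1 = signal

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = GradientBoostingClassifier().fit(X_train, y_train)
    print("test accuracy:", clf.score(X_test, y_test))

Real experiments do something similar in spirit (boosted decision trees have long been popular in particle physics), just with far more features, far more data, and careful treatment of systematics.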
I wonder: are the people doing it trained in computer science or in physics? Not that it should matter to the end result; I'm just curious how people got there.
A lot of the modern research in 'big data' analysis is, or was, driven by physicists. Bayesian inference is about deciding what you can conclude from an observation or a series of observations, and much of the impetus for it came from trying to make sense of experimental results.
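Concretely, the workhorse is Bayes' rule, P(H|D) = P(D|H) * P(H) / P(D). A toy example (all numbers invented) of why a single observation rarely settles anything, and why you need a series of them:

    # Toy Bayes update: how likely is an event to be signal,
    # given that the detector fired? Numbers are invented.
    p_signal = 1e-4          # prior: fraction of events that are signal
    p_fire_given_sig = 0.95  # detector efficiency for signal
    p_fire_given_bkg = 0.01  # false-positive rate on background

    # Total probability that the detector fires at all.
    p_fire = p_fire_given_sig * p_signal + p_fire_given_bkg * (1 - p_signal)

    # Posterior probability of signal, given a firing.
    posterior = p_fire_given_sig * p_signal / p_fire
    print(f"P(signal | fired) = {posterior:.4f}")  # ~0.0094: still mostly background

Even with a 95% efficient detector and only a 1% false-positive rate, a rare signal is still swamped by background after one observation, which is why these analyses have to accumulate huge numbers of events.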
Two of the really great textbooks in the field are by physicists:
1) 'Information Theory, Inference and Learning Algorithms' by David MacKay, a physics professor at Cambridge. Perhaps the most readable and enjoyable textbook I own; certainly up there.
2) 'Pattern Recognition and Machine Learning' by Chris Bishop, now a director at Microsoft Research in Cambridge but formerly a physicist. Delightfully, given the circumstances, his PhD supervisor was Higgs (yes, that Higgs, of boson fame)!
They're probably physicists. You can take IT classes during your degree, and some people reach a very high skill level. Data interpretation is a big part of being an experimental physicist, and these algorithms are very useful, so people seek them out.
Computer science people work in other areas, such as setting up and running the data collection and online processing. (A professor told us many interesting stories about the many Unix servers they built and the bugs they created...)
Few of us are formally trained in CS. Some of us are good; others are not.
My understanding is that the computer engineers at CERN are mostly tasked with IT work; the rest (including DAQ software/firmware, network code, distributed and real-time data processing, etc.) is written by the physicists.