It is just a convention, specifically for interpreting and presenting experimental results. We also use sigma to represent standard deviation in other contexts, of course. Sometimes it represents the Pauli spin matrices. Sometimes it's an index on spacetime tensors.
Life would be hell for any practitioner without single-letter abbreviations. In fact, we like them so much that we adopted the Greek letters too (we ran out of Latin alphabet). And, for better or for worse, convention runs deep in scientific literature. In practice it cuts a lot of redundancy and makes it more efficient for researchers to skim and understand results. But the cost is a years-long learning curve to break into any scientific field's literature.
FWIW, the linked article is from the journal Science, which is a technical publication. Often "sigma" is omitted in sci-comm articles, or at least is translated for the reader. They will say something like "there is a one in X million chance this is a fluke".
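The "one in X million" phrasing is just the tail probability of a standard normal distribution beyond the stated number of sigmas. A minimal sketch of that conversion (using the one-sided convention common in particle physics, via the stdlib `erfc`):

```python
import math

def sigma_to_p(sigma):
    """One-sided tail probability of a standard normal beyond `sigma`."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

for s in (3, 5):
    p = sigma_to_p(s)
    print(f"{s} sigma -> p = {p:.2e} (about 1 in {round(1 / p):,})")
```

So the famous 5-sigma discovery threshold corresponds to roughly a one-in-3.5-million chance of a fluke under the null hypothesis (one-sided); a two-sided convention would halve the odds.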
Looking up from my screen, which is filled with sanity-saving conveniences like having to type \sigma to get a really smart-looking lowercase Greek character to display, so the masses can't make sense of my math.