I don't doubt the news article on this, but even with caffeine pills/powder it'd take nearly half a fistful to get to the LD50, judging by my caffeine tablets. It's not impossible to consume, but it'd be distinctly unpleasant long before you got anywhere even close to dangerous levels.
For my high-caffeine pre-workout powder, I suspect I'd vomit long before I got anywhere near. Pure caffeine is less unpleasant, but still pretty awful, which I guess is why we don't see more deaths from it despite its widespread use.
I agree with you that there really ought to be caution around giving advice on safety-critical things, but this one really is right up there in freak-accident territory: the intersection of somewhat dangerous substances and a poorly regulated form (there's little reason for these to be sold as bulk powders instead of pressed into pills, other than making people feel more macho downing awful-tasting drinks instead of taking pills).
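As a rough sanity check on the "half a fistful" estimate, here's a back-of-envelope sketch (the ~150-200 mg/kg LD50 range, the 70 kg body weight, and the 200 mg tablet size are my illustrative assumptions, not figures from this thread):

```python
# Back-of-envelope: tablets needed to reach a commonly cited
# LD50 range for caffeine. All figures are illustrative assumptions.

def tablets_to_dose(dose_mg_per_kg, body_kg=70, tablet_mg=200):
    """Number of tablets needed to reach a given dose in mg/kg."""
    return dose_mg_per_kg * body_kg / tablet_mg

low = tablets_to_dose(150)   # lower end of an often-quoted LD50 range
high = tablets_to_dose(200)  # upper end
print(f"~{low:.0f} to {high:.0f} tablets for a 70 kg adult")
```

On these assumptions it comes to dozens of tablets, which fits the "not impossible, but hard to do by accident" point.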
I wonder if they’re thinking 200mg per kilo to trigger cell death. I have trouble believing a human heart would survive a dose of 50mg/kg. Half of them surviving four times that much? No, I don’t believe it.
Found an article about a teenager who died after three strong beverages. The coroner is careful to point out that this was likely an underlying medical condition, not the caffeine. The health professional they interviewed claims 10g is lethal for “most” people, which would be 100-150mg/kg. That still seems like something an ER doctor would roll their eyes at.
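Converting the 10g figure into per-kg terms makes it easier to compare with the mg/kg numbers above (a sketch; the body weights are illustrative assumptions):

```python
# Convert an absolute caffeine dose (grams) into mg per kg of
# body weight, to compare the "10 g" claim with mg/kg figures.

def dose_mg_per_kg(dose_g, body_kg):
    return dose_g * 1000 / body_kg

for kg in (60, 70, 100):
    print(f"10 g for a {kg} kg person: {dose_mg_per_kg(10, kg):.0f} mg/kg")
```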
Your example doesn't interact with the Chicken Littling in this thread.
> The hearing was told the scales Mr Mansfield had used to measure the powder had a weighing range from two to 5,000 grams, whereas he was attempting to weigh a recommended dose of 60-300mg.
Nothing to do with an LLM nor with someone not knowing the exact LD50 of caffeine. Just "this article contains someone dying of caffeine overdose, and we're talking about caffeine overdose here, therefore LLM is dangerous."
> some people use caffeine powder / pills for gym stuff apparently.
At 200mg per pill, which is the strongest I've had, I'd still have to down some 70+ pills in one go. Not strictly impossible, but not something you could possibly do by accident, and even for the purpose of an early check-out, it wouldn't be my first choice.
An accident with it in powdered form is possible - people who use them are often used to pre-workout supplements tasting awful, and so might be prepared to down it as fast as possible - but it's a big enough volume of powder that it really is a freak accident.
And if on purpose, using caffeine would just be staggeringly awful...
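For what it's worth, the 70+ pill figure can be checked the other way around (a sketch; the 70 kg body weight is my assumption):

```python
# Check the 70+ pill figure: total caffeine mass, and the
# resulting dose per kg for an assumed 70 kg adult.
pills, mg_per_pill, body_kg = 70, 200, 70
total_g = pills * mg_per_pill / 1000
print(f"{pills} pills = {total_g:.0f} g, i.e. {total_g * 1000 / body_kg:.0f} mg/kg")
```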
the problem isn’t someone’s intent (on purpose/by accident).
it’s intent (want to improve my gym performance so down a bunch of caffeine) combined with incorrect information gained from what is supposedly a trustworthy source (the limit presented is much higher than it actually is for humans).
If they're searching for the LD50, they're already setting themselves up for errors, even with the right information. The LD50 isn't a safe dose, after all, but the median lethal dose. While it's not great if it's wrong, if people search for an LD50 thinking it indicates what they can safely take, it's already going to be hard to protect them against themselves.
some people use caffeine powder / pills for gym stuff apparently.
someone overdosed and died after incorrectly weighing a bunch of powder.
doubt it is a big leap to someone dying because they were told the wrong limits by google.
https://www.bbc.co.uk/news/uk-wales-60570470
as ever, machine learning is not really suitable for safety/security critical systems / use cases without additional non-ML measures. it hasn’t been in the past, and i’ve seen zero evidence recently to back up any claim that it is.