Hacker News
Paging Dr. Bot – Why algorithms cannot replace intuition in medicine (hedgehogreview.com)
1 point by pseudolus on Sept 8, 2023 | 5 comments



Intuition is just another way of saying "I like my anecdata and pretend they're statistically significantly better than guessing."


When you say "better than guessing", what exactly do you mean? Because there are certainly many cases where an expert's guess is going to be much better than a uniform sampling of the possibility space.


Yes, I agree that many experts' informed (not intuitive) opinions are going to be better than uniform sampling. But that's not what intuition is: intuition is the ability to make a prediction without deliberate thought or reasoning. In practice, though, it's really more a statement about one's inability to recognize that one's own mind did think, or did reason, but in a way that never surfaced to the conscious mind.


What an unbelievably pretentious thing to say


Possibly; some true statements sound pretentious.

But what's really pretentious is the author of this article, who cites their silly anecdote to "prove" that AI can't possibly be as good as a doctor's intuition.

"""The basic defect of artificial intelligence follows from this principle. Using descriptive sketches of mental status drawn from reference books, journal articles, and tests, AI would have got my elderly Lebanese patient all wrong. She knew the French kings and their mistresses, something she learned decades ago perusing her family’s library, but not the name of the recently elected American president. Since knowing the president’s name is standard on most tests of mental status, AI would have mistakenly judged her to have an “intact long-term memory with a loss of short-term memory,” and awarded her the ridiculous diagnosis of “altered mental status,” or even senile dementia. It is why AI cannot replace doctors or other professionals who rely to some degree on intuition. Doing so courts ridiculous outcomes. Indeed, AI is becoming known for its stupid mistakes as much as for its triumphs. Even researchers admit that AI lacks “common sense.”"""

My guess is that any real machine learning model for mental status would not base a classification on a single, absolute data point (the inability to name the current or a past president). I would also say that a patient who cannot name the current president but can list all the French kings and their mistresses is not drawn from a typical distribution, and that may indeed indicate something about their mental status.
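To make the point concrete, here is a minimal sketch of what "not classifying on a single data point" looks like. Everything here is hypothetical: the feature names, weights, and threshold are invented for illustration and correspond to no real clinical instrument. The idea is simply that a logistic score over several features lets one missed item (the president question) be outweighed by the rest of the picture.

```python
import math

# Hypothetical feature weights for illustration only; negative weights
# lower the risk score, positive weights raise it.
WEIGHTS = {
    "recalls_current_president": -1.0,   # 1 = correct recall
    "recalls_remote_facts": -1.5,        # e.g. detailed long-term knowledge
    "oriented_to_time_and_place": -2.0,
    "repeats_questions": 2.5,            # short-term memory warning sign
}
BIAS = 1.0

def dementia_risk(features: dict) -> float:
    """Combine all features into a logistic score in (0, 1)."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))

# A patient who misses the one standard test item (the president) but is
# otherwise sharp: the single miss does not dominate the overall score.
patient = {
    "recalls_current_president": 0,
    "recalls_remote_facts": 1,
    "oriented_to_time_and_place": 1,
    "repeats_questions": 0,
}
print(round(dementia_risk(patient), 3))  # → 0.076, well below any plausible cutoff
```

The author's hypothetical AI fails precisely because it is imagined as a hard rule on one feature; an actual trained model learns weights across many features, so an atypical pattern like this one shifts the score rather than forcing a verdict.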



