A Wellness Chatbot Is Offline After Its ‘Harmful’ Focus on Weight Loss (nytimes.com)
6 points by amadeuspagel on June 9, 2023 | 8 comments



Whoever thought LLMs were at a point where they could responsibly replace a mental health professional was, at the very least, incorrect.


It's true that LLMs cannot replace mental health professionals. It's also true that mental health professionals cannot replace LLMs (even within the realm of mental health).

I personally have a lot of resentment toward the therapeutic process. In short, the fact that the therapist wouldn't actually get involved in my life, wouldn't let down the boundaries, was quite painful. I understand there are many reasons for those boundaries to exist, and I'm not proposing any changes to the therapy industry; I believe it should continue and that it will help many people.

What I am saying is that the therapy industry has one box, and not everyone is going to find what they need inside that box, inside those boundaries. In the vast realm of mental health problems and needs, you can't tell me that one industry has the solution for every person and every problem.

I've found LLMs to be more helpful than my therapist was. There are no artificial boundaries. The LLM is a computer program, and that's all I expect it to be. It's available 24/7. It is far more consistent than a human can ever be; it doesn't have good and bad days. LLMs are helpful to me, but I'm also in a good enough head space that I won't be harmed if they malfunction. I do believe they can be dangerous, but so can a therapist.

I think there's a future for mental health focused LLMs, probably larger than you would think.

(Before disagreeing, remember that I have mostly focused on my own personal experience as someone who has gone to therapy. I encourage others to form their own opinions, and if you believe therapy might help you, try it; it is safer than an LLM and has helped many people.)


Should be noted that this wasn’t an LLM, but rather a traditional “expert system” with canned messages and deterministic rules.
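
For anyone unfamiliar with the term, a minimal sketch of how such a system could work, in Python. Everything here is invented for illustration; the actual bot's rule set is not public.

    # Hypothetical sketch of a deterministic, rules-based chatbot.
    # Keywords and canned text are made up; first matching rule wins,
    # and there is no model of context or conversation history.
    CANNED_RULES = [
        ({"lose weight", "weight loss", "calories"},
         "A safe rate of weight loss is 1 to 2 pounds per week."),
        ({"anxious", "anxiety", "stressed"},
         "Deep breathing can help. Try inhaling for four counts."),
    ]

    FALLBACK = "Tell me more about what's on your mind."

    def reply(user_message: str) -> str:
        text = user_message.lower()
        for keywords, canned_response in CANNED_RULES:
            if any(keyword in text for keyword in keywords):
                return canned_response
        return FALLBACK

The failure mode in the article follows directly from this structure: the weight-loss rule fires on a keyword match no matter who is asking, so a user describing an eating disorder gets the same canned dieting script as anyone else.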


Silly ol' AI, telling people the truth too bluntly. So many health issues go right back to overeating and too little exercise. Most of them, for most of us?


I'm guessing you did not read past the headline. The criticism is valid: the chatbot was telling a self-confessed anorexic that the solution to her problems was to cut calories. That's definitely unhelpful.


Will AI ever do any better than giving average advice? These systems suffer from a regression to the mean, giving average advice for the average person.

So how will they 'fix' it? Put in a hack to recognize this query and give some boilerplate answer?
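
One way such a patch could look, purely as a guess at what that "hack" means, in Python. The term list, message, and function names are all invented for illustration.

    # Hypothetical keyword pre-filter that short-circuits risky
    # queries to boilerplate before the normal reply logic runs.
    RISKY_TERMS = {"anorexia", "anorexic", "bulimia", "starve", "purge"}

    BOILERPLATE = ("I can't help with that topic. Please contact a "
                   "qualified professional or a crisis line.")

    def guarded_reply(user_message: str, underlying_reply) -> str:
        tokens = set(user_message.lower().split())
        if tokens & RISKY_TERMS:
            return BOILERPLATE  # override whatever the system would have said
        return underlying_reply(user_message)

The brittleness is visible in the sketch itself: "I barely eat anymore" contains none of the listed terms and sails straight past the filter, which is why this kind of patch is a hack rather than a fix.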

I'd be more concerned if the person receiving the skewed advice didn't recognize it as wrong.

Anyway, if this gets an AI taken down, then they'll all come down. They all suffer from this.


In this case it wasn't an LLM, or AI. It was a pretty dumb chatbot script.


So many health issues experienced by people with anorexia?



