Ask ChatGPT -> get advice -> double-check on Google -> try advice -> my medical issue is solved
This has happened to me 3-4 times, and it hasn't steered me wrong yet. Meanwhile, doctors have misdiagnosed me or my wife a bunch of times over the years. Doctors may have more knowledge, but they barely listen, and often don't keep up with the latest research.
This is one thing people who shit on chatbots don't get. It doesn't need to be god to be useful; it just needs to beat out a human who is bored, underpaid, and probably underqualified.
Hope it never hallucinates on you. Doctors will soon need to warn against DoctorGPT MD, not just Doctor Google MD.
Maybe I'm a low-risk guy, but I would never follow a medical solution spit out by an LLM. First, I might be explaining myself badly, hiding important factors that a human would spot immediately. Then, yeah, there's the hallucination issue, and if I have to double-check everything anyway, well, I might as well just trust a (good) professional.