
People use it to get facts, and trust it more than Google. I have a tech-illiterate boss who asked me to do things a certain way because "ChatGPT said so", instead of trusting me, an experienced professional. It wasn't like this with a Google search, so why now? Natural language has a big impact on how the product is perceived.



We've seen numerous stories at this point about lawyers trusting AI to generate case documents that turned out to have false citations. AI-generated scientific papers are being published. Doctors are using AI. Law enforcement is using AI. Everyone is using it, and a lot of people are using it with the assumption that it's intelligent and factual, that it works like the computer from Star Trek. People on this very forum who should know better have said they trust AI more than they trust themselves and more than other people.

AI probably has a niche where it's useful, but because it smells like a magic money machine that will let managers replace employees and create value from essentially nothing, modern capitalism dictates we must optimize our entire economy around it: no holds barred, damn the torpedoes, full speed ahead, because "money." I just hope the fever breaks before people start getting killed.


Note that there's a big difference between AI and LLMs. There are plenty of reliable techniques in the AI toolbox (for example, ones that provide calibrated confidence estimates). It's just that LLMs aren't one of them.
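
To make that concrete, here's a minimal sketch (my own illustration, assuming scikit-learn; nothing here comes from the comment above) of what "providing confidence estimates" can look like outside of LLMs: a classical classifier whose predicted probabilities are calibrated, so a reported 0.8 confidence actually corresponds to being right roughly 80% of the time, and which you can validate against held-out data.

    # Hypothetical example, not from the thread: a calibrated classifier
    # that attaches an explicit, checkable confidence to each prediction.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression
    from sklearn.calibration import CalibratedClassifierCV

    # Toy dataset standing in for any tabular prediction task.
    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Wrap a simple model so its output probabilities are calibrated
    # via cross-validation on the training data.
    model = CalibratedClassifierCV(LogisticRegression(max_iter=1000), cv=5)
    model.fit(X_train, y_train)

    # Explicit per-prediction confidence for the positive class.
    proba = model.predict_proba(X_test)[:, 1]
    print(proba[:5])

The point isn't this particular model; it's that the confidence is an explicit, testable number rather than a fluent-sounding answer you have to take on faith.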


It turns out that in a lot of (low-skilled) knowledge work, the nonsense AI spits out is superior to what humans produce.



