
Hallucinations exist because people are using LLMs as knowledge oracles, which they are not. I doubt this will ever be solved unless a new type of model is invented for that purpose.



With current models it cannot be solved. Maybe it cannot be solved at all; humans make things up (in the sense that they may not know they are stating an untruth) all the time, and they are the best example of intelligence we have.



