
Can we please stop calling it "hallucination"? It's such a poor word choice that it's not even remotely accurate. Hallucinations are perceptual experiences not grounded in reality; LLMs don't have perception, and they most likely don't have experiences.

What LLMs are doing is bullshitting (or confabulating, if you want to mind your language). And by the way, humans do it all the time.



