I treat them as always hallucinating; it just so happens that, by accident, they sometimes produce results that look intentional, considered, or otherwise truthful to the human observer. The accident is sometimes highly probable, but it's still an accident. Humans are similar, but we are the standard for what counts as a hallucination, clinically speaking. For us, a hallucination is a cause for concern; for LLMs, it's just one of all the possible outputs. Monkeys bashing on a typewriter. This difference is an essential one, imo.
