
ChatGPT does surprisingly well against information hallucination. This is my first try:

> Me: Which city in Illinois was Barack Obama born in?

> Bot: Barack Obama was born in Honolulu, Hawaii on August 4, 1961. He was not born in Illinois.




That’s not a very good test of hallucination, because there’s likely to be a sentence on the internet (and thus in its training set) that answers that question directly. If you want to test hallucination, you need to ask it a novel question, or something contingent on the context of the question.

In my case, GPT-3 failed the hallucination test unprompted, but that’s kind of like crashing your car in the DMV parking lot. I didn’t need to go further.



