
It's not being sold as long-form autofill. It's being sold as something which will respond to queries, and there is an expectation that the information it replies with should have some accuracy.

I don't disagree with your generalization of how it functions, or that it lacks empirical knowledge. Some responses are accurate, some are not. It will always give a response, but it does not always include a disclaimer that the response might be inaccurate.

So, in criticism of the existing nomenclature used to describe this phenomenon (hallucination), I simply suggested something more appropriate.




I agree that it's being positioned deceptively.

Today, it is just a novelty.



