elitan | 8 months ago | on: Legal models hallucinate in 1 out of 6 (or more) b...
What's the corresponding rate for humans?
Gormo | 8 months ago
LLMs are only capable of hallucinating, whereas humans are capable of hallucinating but are also capable of empirically observing reality. So whatever the rate is for humans, it's necessarily lower than that for LLMs.