Hacker News
alienicecream | 53 days ago | on: We gotta stop ignoring AI's hallucination problem
Intelligence is the ability to comprehend a state of affairs; the input and the output are secondary. LLMs treat the input and the output as primary and skip over the middle part, which is the important bit.