
> You are just an interconnected model of billions of neurons that has been trained on millions of facts created by other humans.

...but I didn't pop out of the womb that way, and as you said, over my lifetime I will read less than a millionth of the data that GPT-3 was trained on. GPT-2 had a better compression ratio than GPT-3 (more training text absorbed per parameter), and on the road we're on I'm sure a GPT-4 will have an even worse ratio than GPT-3.
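
For what it's worth, here's the back-of-envelope version of that claim. The corpus and parameter figures are the approximate published ones from the GPT-2 and GPT-3 papers; treating "compression ratio" as training-text bytes per byte of weights (at fp16) is just my own rough framing:

    # Rough sketch: how much training text each model "absorbs" per byte of weights.
    # Corpus sizes are approximate figures from the GPT-2/GPT-3 papers.
    gpt2_corpus_bytes = 40e9        # WebText, ~40 GB of text
    gpt2_param_bytes  = 1.5e9 * 2   # 1.5B parameters at 2 bytes each (fp16)

    gpt3_corpus_bytes = 570e9       # filtered CommonCrawl alone, ~570 GB
    gpt3_param_bytes  = 175e9 * 2   # 175B parameters at 2 bytes each (fp16)

    print(gpt2_corpus_bytes / gpt2_param_bytes)   # ~13x
    print(gpt3_corpus_bytes / gpt3_param_bytes)   # ~1.6x

By that crude measure GPT-2 packs roughly an order of magnitude more training text into each byte of weights than GPT-3 does.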

Rote memorization is hardly what I'd call intelligence, but that's what these models are doing. If they were becoming more intelligent over time, they'd need less training data per unit of insight, not more. This isn't a dismissal of how impressive the algorithms are, and I'm not pulling the classic AI-effect move of shifting the goalposts over time; I fundamentally believe we're scoring a goal in our own team's net. This is backwards.



