
Before GPT-2, I would have agreed. With that and GPT-3, I think most people from the 70s familiar with science fiction versions of AI would think this is pretty close to how they conceived of AI, at least until they were educated on the specifics of how it works.



I said this with GPT-3 in mind too. To be fair: my mind was absolutely blown when I first got to play with it, and I'm still deeply impressed by it. The experience altered my beliefs about human intelligence. I've always appreciated the joke that humans can sometimes be hard to distinguish from a Markov chain, but GPT-3 made me take it seriously - the quality of its output over spans of one to five sentences is astonishing in how natural it feels.
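For anyone who hasn't run into the Markov chain joke before, here's a toy illustration (not part of the original point, and the corpus and parameters are made up): a word-level Markov chain that only tracks which word follows which pair of words, yet can still produce short spans that read surprisingly naturally.

  import random
  from collections import defaultdict

  def train(text, order=2):
      """Build a word-level Markov chain: state (tuple of words) -> possible next words."""
      words = text.split()
      model = defaultdict(list)
      for i in range(len(words) - order):
          model[tuple(words[i:i + order])].append(words[i + order])
      return model

  def generate(model, order=2, length=20):
      """Walk the chain from a random starting state."""
      state = random.choice(list(model.keys()))
      out = list(state)
      for _ in range(length):
          nxt = model.get(tuple(out[-order:]))
          if not nxt:
              break
          out.append(random.choice(nxt))
      return " ".join(out)

  corpus = ("short spans of output can feel natural even when short spans of "
            "output come from a very simple statistical model of text")
  print(generate(train(corpus)))

The point of the comparison is that locally plausible continuations require no understanding at all, which is exactly the failure mode described next.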

Still, it doesn't take more than a few sentences from GPT-3 to realize there's nothing on the other side. There's no spark of sapience, not even intelligence. There's the layer that can string together words, but there's no layer that can reflect on them and check whether they make any kind of sense.

To be fair, "sapience" may be a bit too high for the lower bound of what is an AI. I still think the bound is on that control / reflexivity layer. It's the problem that stifled old-school symbolic AIs. To this day, we can't even sketch an abstract, formal model for this[0]. Maybe DNNs are the way to go, maybe they aren't. But GPT-3 isn't even trying to solve that problem.

I'm not sure when I'll be ready to call something a proper AI, but it would first need to demonstrate some amount of on-line, higher-level regulation and learning. I.e. not a pretrained, fixed model, but a system where I could tell it, "look, you're doing this wrong, you need to do [X]", and it would pick up the refined pattern from a couple of examples, not a couple hundred thousand. For that, the system would need some kind of model of concepts - it can't be just one big blob of matrix multiplications, with all conceptual structure flattened and smeared over every number.
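For concreteness, a purely hypothetical sketch of the kind of interface I have in mind - concepts as explicit, addressable objects that a correction plus a handful of examples can update in isolation. Every class and method below is invented for illustration; nothing like this exists in GPT-3 or any current API.

  from dataclasses import dataclass, field

  @dataclass
  class Concept:
      name: str
      note: str = ""                                # latest "you're doing this wrong" feedback
      examples: list = field(default_factory=list)  # a handful of examples, not 100k

      def refine(self, correction: str, new_examples: list):
          # On-line update scoped to this one concept.
          self.note = correction
          self.examples.extend(new_examples)

  class ReflectiveAgent:
      def __init__(self):
          # Conceptual structure kept explicit rather than smeared over weights.
          self.concepts: dict[str, Concept] = {}

      def teach(self, name: str, correction: str, examples: list):
          self.concepts.setdefault(name, Concept(name)).refine(correction, examples)

  agent = ReflectiveAgent()
  agent.teach("summarize", "don't drop the conclusion", ["example A", "example B"])

The sketch obviously dodges the hard part (how the concepts get grounded and used), but it shows the shape of the thing: feedback addressed to a concept, not to a gradient.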

--

[0] - I think the term of art here is "metacognition", but I'm not sure. It seems to describe what I'm getting at, particularly metacognitive regulation.





