
Is it surprising that a system that was exclusively trained on emitted language will only do well on language?

I don't see how I can extrapolate to anything beyond that. I certainly can't extrapolate that it would be unable to learn other tasks.




Everything is defined in language. As a system learns from a corpus of texts, it also builds up reasoning to a large extent. For example, GPT-4 can guide you through solving a problem like building a drone that flies and crawls to inspect an attic, or through watering plants. So this goes beyond language, unless I missed your point.


Karpathy explains it far better than I did in his "State of GPT" talk about a month ago: https://www.youtube.com/watch?v=bZQun8Y4L2A

Yes, language can do that, but books and other texts, on average, leave out a lot of steps that humans have learned to do naturally. The fact that these models can be prompted to do those steps, and that their answers improve when they do, indicates that there is definitely some level of "reasoning" there.

But it has weird limitations in its current forms. I expect that to improve pretty quickly, tbh.


Not everything. For example, how did Wagner create his music? The ability to appreciate beauty stands above reasoning. But your point largely stands: language processing alone allows one to build a "linear thinking machine" with superhuman reasoning ability. It won't be able to make music or art, but it will create a spaceship and take science to the next level. As a side effect, such an AI will erase all art, since art will look like noise in its "mind".



