
On top of it not having "motivation" to communicate, it literally has nothing to communicate in the first place.

That's the key difference. We use language to express conceptualizations. We have some kind of abstract model somewhere that we are translating.

Maybe it isn't a cohesive model either. All I can say for certain is that - whatever it is - we are expressing it.

GPT does not express. It parrots. There is no conceptualization.




The more experience I get, the more I wonder if this is really the case for us. We certainly have some kind of abstract model in our heads when thinking deeply about a problem. But in many settings - in a work meeting, or socially with friends - I think it is a much more automatic process. The satisfaction you get when saying the right thing, the dread when you say something stupid: it is just like playing a game. Maybe the old philosophical concept of society as merely "language games" is correct after all. A bit silly, but I find the thought makes annoying meetings a bit more bearable.

But you are of course right about GPT: it has no inner life and only parrots. It completely lacks an inner state, any existence outside of the brief moment it is invoked, or anything like reflection. Reminds me of the novel "Blindsight" (which I actually haven't read yet, but have heard good things about!), where there are beings that are intelligent but not conscious.


Intelligent but not conscious would still be a few steps ahead of GPT.

We can take a concept and refactor it symbolically. GPT can't do that. All it does is find symbols that are semantically close to other symbols.
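
To make "semantically close" concrete: in embedding terms it roughly means nearest neighbors by cosine similarity. Here is a toy sketch in Python, with made-up three-dimensional vectors standing in for the thousands of learned dimensions a real model would use:

    import math

    # Hypothetical embedding vectors, invented purely for illustration.
    embeddings = {
        "king":   [0.8, 0.6, 0.1],
        "queen":  [0.7, 0.7, 0.1],
        "banana": [0.1, 0.2, 0.9],
    }

    def cosine(a, b):
        # Cosine similarity: dot product divided by the product of the norms.
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return dot / norm

    query = "king"
    neighbors = sorted(
        (w for w in embeddings if w != query),
        key=lambda w: cosine(embeddings[query], embeddings[w]),
        reverse=True,
    )
    print(neighbors)  # ['queen', 'banana'] -- "queen" is the semantically closer symbol

Ranking by that kind of similarity is all I mean by "finding close symbols"; refactoring a concept is a different operation entirely.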


I’m not sure that those two processes are as distinct as you believe them to be.


You seem very sure they aren't, yet you have no evidence apart from your own belief that you might be correct.

That's circular reasoning.


Yup.



