
> They're just system 1 thinking, the equivalent of putting a gun to someone's head and asking them to solve a large division in 2 seconds.

No, it's the equivalent of putting a gun to someone's head and asking them "what are my intentions?" Which is readily available to any being with a theory of mind.




I don’t think LLMs have theory of mind, but your point is not very strong. You can literally query ChatGPT right now and see that it can quite easily figure out intentions (both superficial and deep) if a gun is held to a head.

Because, obviously, training data probably includes a decent amount of motivation breakdowns as a function of coercion.

It doesn’t know why, but it knows what to say.


LLMs don't fail those kinds of tasks though. GPT-4 is very good at keeping track of who knows what and why in a story. You can test this yourself.


Don't think so.

Put a gun to a person's head.

Ask them to do a division.

Then scream at them: "HOW DID YOU DO THAT? TELL ME NOW, OR YOU'RE TOAST."

Even most humans would splutter and not be able to answer.


The division problem is arbitrary and irrelevant to any notion of theory of mind. If you have a better example, you can feel free to offer one. I already did so above.


The point is not about the example of division, it is about 'tell me your intentions'.

A human also can't explain how their neurons calculated a division, or anything else, including an 'intention', even with a gun pointed at them.

So why is it a fault of an AI if it can't explain how it works? That is not a requirement for AGI.


Every point you made here either reiterates something I've already said or misinterprets the nature of the discussion altogether.


Maybe,

You: "Which is readily available to any being with a theory of mind."

I'm saying humans do not have this ability. We don't have a valid, proven theory of mind, and even if we did, that knowledge wouldn't change how our brains process inputs or somehow give us access to how we arrived at an answer. We don't know our 'intention'.


I see the confusion now. Theory of mind is about recognizing the existence of other minds, not about being able to plumb the depths of one's own mind. If anyone had the latter skill, psychotherapy wouldn't need to exist and no one would ever have heard of Freud.


What? My answer would be "I have no idea!"

Are they faking it? Will they murder me? Are they just trying to scare me?

I don't understand the point you're trying to make.




