> They're just system 1 thinking, the equivalent of putting a gun to someone's head and asking them to solve a large division in 2 seconds.
No, it's the equivalent of putting a gun to someone's head and asking them "what are my intentions?" Which is readily available to any being with a theory of mind.
I don’t think LLMs have theory of mind, but your point is not very strong. You can literally query ChatGPT right now and see that it can quite easily figure out the intentions (both superficial and deep) of someone holding a gun to a head.
Because, obviously, the training data probably includes a decent number of breakdowns of motivation under coercion.
The division problem is arbitrary and irrelevant to any notion of theory of mind. If you have a better example, feel free to offer one. I already did so above.
you :"Which is readily available to any being with a theory of mind."
I'm saying humans do not have this ability. We don't have a valid, proven theory of mind, and even if we did, that knowledge would not change how our brains process inputs or somehow give us access to how we processed an answer. We don't know our own 'intentions.'
I see the confusion now. Theory of mind is about recognizing the existence of other minds, not about being able to plumb the depths of one's own mind. If anyone had the latter skill, psychotherapy wouldn't need to exist and no one would ever have heard of Freud.