Maybe, but symbolic thought can get pretty far away from what we generally call "language." I bet you can reason about 1+3x=22 pretty easily without any words whatsoever, or about the sound of one ascending octave after another, or about the approximate G-force induced on your body if you take the next turn without applying the brakes.
All of these forms of reasoning are true and useful calculations: when we talk about "intuition," what we usually mean is that we have a lot of experience and internal reasoning about a subject, but we struggle to translate it to and from the "language" part of our brain. Nonetheless, any social dancer will tell you that a dialog is possible through receiving and inducing g-forces alone. You can reason this way about abstract concepts like orbits without ever touching a single word or concrete symbol.
Edit: the key aspect of reasoning, imho, is the ability to make predictions and introspect them against a database of other predictions, using an adversarial heuristic to weight the most plausibly useful results. Perhaps our pattern matching AIs of today just lack sufficient "experience" to do what we call reasoning.
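To make that loop concrete, here's a loose Python sketch of "predict, compare against a store of past predictions, adversarially weight." Everything here (the names, the scoring rule, the data) is illustrative guesswork on my part, not any real system's design:

```python
from dataclasses import dataclass

@dataclass
class Prediction:
    claim: str
    confidence: float  # prior plausibility in [0, 1]

def adversarial_weight(candidate: Prediction, past: list[Prediction]) -> float:
    """Score a candidate against a 'database' of earlier predictions:
    support from agreeing ones, a penalty from conflicting ones."""
    support = sum(p.confidence for p in past if p.claim == candidate.claim)
    conflict = sum(p.confidence for p in past if p.claim != candidate.claim)
    return candidate.confidence + support - conflict

# "Experience" accumulated from earlier situations.
past = [Prediction("wet roads ahead", 0.8), Prediction("dry roads ahead", 0.1)]
candidates = [Prediction("wet roads ahead", 0.5), Prediction("dry roads ahead", 0.5)]

best = max(candidates, key=lambda c: adversarial_weight(c, past))
print(best.claim)  # "wet roads ahead" -- experience tilts the judgement
```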
>I bet you can reason 1+3x=22 pretty easily without any words whatsoever
I've tried to do it, but I can't. I had to do something like "ok, so we subtract one from both sides and then it's easy, 3*7=21". Maybe I could do 2+8, but I still think the word ten "aloud".
I was able to do it with no words. I 'saw' the steps as if on a piece of paper: 3x=22-1=21, then x=21/3=7. But I have a degree in applied math; perhaps not internally vocalizing is just a matter of extreme familiarity. It also happened very quickly, so perhaps there was no time to vocalize anyway.
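For what it's worth, here are those two moves written out explicitly, as a minimal Python sketch (the variable names are mine):

```python
# Solving 1 + 3x = 22 with the same two steps described above.
constant, coefficient, rhs = 1, 3, 22

rhs -= constant         # subtract 1 from both sides: 3x = 21
x = rhs // coefficient  # divide both sides by 3:     x = 7

assert constant + coefficient * x == 22
print(x)  # 7
```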
To be fair, math is a language in itself... with many dialects, come to think of it.
At the end of the day, though, thought requires communication, even if internal. Even physics is modelled as a sort of 'message passing' when we try to unravel what causality really is. Similar to how a processor has cycles, I think something similar (but unsynced) happens as part of what we call 'thinking'.
Most people can't do 1 + 3x = 22 without any words or symbols. People who can don't realize that most people can't. I'd argue they aren't using logic when they do it; it's just very good pattern matching.
It's also possible to do mentally by visualizing it rather than internal monologue. You can imagine the 1 on the left arcing over to the right, cancelling the 22 to 21, then the 3 moving under the 21 and the 21 descending through the 3 to become 7.
yup. I considered myself an /extremely/ verbal person when reasoning, but what I do with the above feels closest to 'moving the 1', almost like balancing a mental scale.
I never really noticed that before. I'm not great at math, fwiw.
Regarding “1+3x=22”, I’m actually not sure; the number words certainly appear in my head when solving the equation. But even then, I would count “1+3x=22” as constituting language. Perception of sound, G-forces, and dancing don’t perform type-2 reasoning by themselves, so I don’t think your argument applies there.
Regarding your edit: no, I think the key aspect of the kind of reasoning we are missing in current AI is the ability to hold the reasoning in your mind, and to iterate on it and evaluate (judge) it there.
It is very difficult to use words to discuss the semantics of non-verbal, non-symbolic thought. I was pointing at several different plausible spaces for semiotics, and at how these could be spaces for reasoning, in the hope that one of them might be relatable.
If you use words in your mind when doing math, and when making or listening to music, and so on, then it is very difficult to find common ground from which to see that these other realms of thought are capable not only of prediction, but also of producing evidence that leads to judgement: that is to say, the key aspects of "reasoning." I picked those examples because I thought they had broad enough appeal to be relatable, and because I do not personally hear words in my head when doing any of these activities, whether it's calculus or tango; yet I still find calculus and tango to be places where reasoning occurs.
Some of them, like math or music, are closer to the kind of symbolic thought we use when we discuss things with words. Others, like the experience of g-forces, are not. I present them as a sliding scale between "word-based" reasoning and "non-linguistic" reasoning. Perhaps you can think of a realm that better fits your personal experience of intuition, and inspect whether those intuitions are capable of "real" reasoning in the absence of language, or whether intuition should never be trusted even when you have a great deal of experience in that area. Perhaps, in your estimation, anything that cannot produce evidence articulable in word form is suspect.
Personally, I find all these methods, including language, to be suspect. I don't find language especially better than the other methods at producing the kind of evidence needed for prediction, correct judgement, or discourse, unless you reduce "reasoning" to tautologically require it.
One of the best features of language is that writing allows easy inspection and iteration of what was written; but these things are possible in other realms too. It's just that we didn't have great tools for introspecting and iterating on their "ideas" outside our own minds. These days, those tools are readily available in many more realms of human insight.
i feel it is worth pointing out, as another commenter highlighted, that language, and even more abstract symbolic languages, bring about fluency if you've practiced "speaking and writing" them enough.
i think native speakers hardly "think" about the steps necessary to form a grammatically correct expression, and most of the time just "know".
fluency is not the same as lacking an internal framework for interpreting symbols, or thinking in terms of them.