The article attempts to relate several hypothetical anecdotes, informally, but is otherwise scant on actual evidence.
Here and there, vague nods are sprinkled toward unnamed "researchers" and "the new way," but with few specifics on who, when, or how. Mostly it's just: trust us, here's how it is now.
The article uses only one name, and only aims to chip away at it. It has no new celebrities, no names stepping forward to take credit for the incredible breakthroughs that destroy the old and oh-so-inferior.
The authors, and the editor, are after something. But they offer nothing (or very little) in exchange, presupposing that it's just common sense which prevails over the tyranny of something innately unlikable.
Not sure why linguistics ruffles some people's feathers. It's such a dry, imprecise sociological subject after all.
EDIT: they make a single mention, in passing, of research by Harvard psychologist Steven Pinker, but very briefly, and don't return to it afterward.
Even a single citation to a paper would have gone a long way.... sigh... From my perspective, Universal Grammar is not even the biggest contribution Chomsky made to language acquisition: it's the notion that the mechanism for language acquisition is the same in children as it is in adults. What's weird about this is that I have always been under the impression that Chomsky's opinion was that language acquisition was not a lock-step uncovering of universal grammar, but that universal grammar was simply the best way to represent language. Indeed, the work that has been done showing that Krashen's natural order hypothesis is correct (that people acquire the grammar for a language in a certain particular order, independent of the order in which it is presented) gives some credence to that notion. It doesn't necessarily follow that the brain actually holds a representation of universal grammar.
I came away from this article being confused as to whether I need to re-read Chomsky's work, or whether the author needs to do so.
> people acquire the grammar for a language in a certain particular order, independent of the order in which it is presented
I didn't know about this. It sure makes sense. Do you know of any institute, course, or individual that makes use of this fact in order to optimize for learning a new language?
I need to learn German, and let's just say that I find the traditional learning methods almost impossible. And I have been living in a German-speaking country for over 5 years now.
Like read books or have someone tutor me? Right now I spend zero time doing this. I just gave up. I tried but nothing stuck, and it was so difficult for me on an intellectual level that quite literally after one hour of study I was so mentally worn out that I could not do my job. I had to sleep through the rest of the day and I literally felt like I was going crazy.
I don't get it: I can learn (and remember) a new programming language and have some practical level of competency in hours (even with Haskell; actually, Haskell was one of the easier languages), but with real languages I am totally inept.
In school I studied 8 years of French, two or three classes per week, and it was a real course, nothing leisurely, lots of homework. But I can't speak a single word of French. Believe me, I worked really hard those 8 years just to get a passing grade, and if I weren't exceptional at Physics (won 1st prize at the National Physics Olympiad, and many other national contests), I'm pretty sure my teacher would have failed me.
I spoke my first word while being 3 months old, and at 6 months I was using coherent sentences and engaging in conversation with people. I guess I used up all my language acquisition skills in that stage of development.
Acquiring languages isn't about going through books and studying—those are methods that adults can use to leverage the metalinguistic knowledge acquired from their native language. There's a pretty good popular-science book on the topic, Becoming Fluent: How Cognitive Science Can Help Adults Learn a Foreign Language, by Richard Roberts and Roger Kreuz (MIT Press, 2015). https://www.amazon.com/Becoming-Fluent-Cognitive-Science-Lan...
People acquire languages, they don't study them. Study is just a way of speeding up the process—with immersion you can gain a huge amount of passive knowledge of a language without any study, and without realizing you are doing so. Questions like "what books are you studying" or "how many hours did you do grammar drills" aren't very useful. I suspect that typical high-school foreign-language study is a sneaky way of keeping the US a society of Anglophone monolinguals. If the US wanted a multilingual population, it would permit dual-language immersion elementary schools, but those have only been popping up in the last decade or so.
How much time do you spend fumbling around in German without falling back on English? How often do you try to communicate with German monolinguals? Is your main problem comprehension or production? If you've been living in a German-speaking country for 5 years, your passive knowledge of German is probably better than you think. The heart of a language is in its words more than its grammar, but there are just a lot of them.
> Acquiring languages isn't about going through books and studying [...] People acquire languages, they don't study them.
I know. Absolutely agree.
> There's a pretty good popular-science book on the topic, Becoming Fluent: How Cognitive Science Can Help Adults Learn a Foreign Language, by Richard Roberts and Roger Kreuz (MIT Press, 2015).
Thanks, I will check it out.
> Is your main problem comprehension or production?
I think my main problem is production.
> If you've been living in a German-speaking country for 5 years, your passive knowledge of German is probably better than you think.
You might be right; I know a lot of words. When it comes to cooking, I sometimes know words my German-speaking friends don't (I like to cook and buy lots of less-than-usual ingredients).
> The heart of a language is in its words more than its grammar, but there are just a lot of them.
As a native German speaker I would say that grammar plays a much larger role in German than in English, and I personally cannot even imagine how one can learn German without grasping the grammar (I wrote more about that in https://news.ycombinator.com/item?id=12328906 and https://news.ycombinator.com/item?id=12243298). So my recommendation is (as I wrote in https://news.ycombinator.com/item?id=12196101) to do brute-force training on the grammar until you are very confident that you know it inside out, since otherwise you will always have to stop in the middle of a sentence to work out how a verb is conjugated or how an adjective + noun combination is declined - not good for fluent conversation.
EDIT: I have read multiple times that many people have problems learning German on Duolingo, since there is too little focus on grammar. That approach might work for English, and perhaps for Spanish, but surely not for German.
I sincerely doubt you spoke your first word at 3 months old, and there's no way in hell you were using coherent sentences at 6 months old.
That being said: have you tried Duolingo? It's free, and it's less than 10 minutes a day. So even if you don't learn a single German word (I've learnt enough in a year, doing 10 minutes of German a day, to teach my 3-year-old a fair bit and read basic kids' books to him), at least you won't be so frustrated that you have to sleep for the rest of the day.
> Have you tried Duolingo? It's free, and it's less than 10 minutes a day.
Thanks, no, I haven't heard of this. I will try it for sure!
> I sincerely doubt you spoke your first word at 3 months old, and there's no way in hell you were using coherent sentences at 6 months old.
When my grandmother told me this, I called total bullshit. But then I asked around, and all my relatives and neighbours corroborated it. Since this was so unusual, they were so impressed at the time that they could remember specific conversations we supposedly had. Yes, parents and aunts exaggerate these things, but random neighbours?
I still found this hard to believe, but there are documented cases of people doing this[1].
I was a very gifted child, I learned to read somewhere around two years of age by studying the book while my mother or grandmother read to me. I also learned arithmetic by watching people use cash at the store. Numbers were so fascinating to me that somehow my parents got me a cash register to play with.
These things I remember clearly. Nobody taught me these things; I figured them out myself.
I was fascinated with money. I had a bank (actually my cash register) where my mother would put all the change. I charged a fee for both deposits and withdrawals, but I paid interest too. I cut all the coupons from newspapers and magazines and demanded that my parents use them. All the savings from the coupons went to me, and from that revenue stream I paid my parents compound interest.
I also had a store where every product cost $1. We would go to the supermarket and buy all kinds of household stuff, which I then resold to my parents for real cash. I kept very good records for my bank and for my store. It felt very important to me that every transaction be recorded. I always gave the customers carbon copies, so they had a record too.
Maybe I should start a business instead of doing this computer nonsense.
Unfortunately my grandmother thought I had autism rather than exceptional intelligence, and dragged me from psychologist to psychologist, who gave me all kinds of tests. They started with IQ tests (I always scored over 160), but once they found I was gifted, they had me do all other kinds of tests relevant to their current research. My grandmother was not happy that they didn't really treat me, so she sent me to various psychiatrists instead.
My family pretended we were really, really poor, even though we were not, and didn't buy me the books and computers I needed. They sent me to a school full of stupid, violent, and quite literally criminal children.
By age 8 or 9 I had mastered the Feynman Lectures on Physics and was studying particle physics. But I was slowed down by the lack of books and learning material.
I hated school. What a waste of my life. I keep stumbling upon all these child prodigies who finished college at 12, while I had to endure that stupid school and lacked the most basic education appropriate for me.
Remembering all this made me feel depressed, so I will stop now. There was a point I was trying to make (how I learn everything by myself, inductively and top-down, and how hard it is for me to learn bottom-up), but it's too difficult to write all this.
You're too arrogant. If you want to learn another language, you need to be humble. Fix that and go out and speak with your fellow Germans: poorly at first, but with time you will do better.
I think this is exceptionally good advice, but phrased in a way that I can't imagine it being accessible to the target audience. People who feel they are smart are often stymied when they run into their own humanity. It's a journey of discovery. Embracing failure is hard, humbling and for some people literally humiliating. That the humiliation is healthy is something that takes time to understand ;-)
Yeah, and especially when learning a language....because you are going to make mistakes, and people will laugh at you. You may have the experience of talking to a 4-year-old, and realizing the child can speak better than you can.
But you humble yourself, keep going, and then you learn the language.
Why do you want to converse with people in German, rather than English? If you don't have a satisfactory answer to that question, you probably won't be able to speak German. "Because I think I should" is not motivating.
People learn languages best by simply listening to others talk. Hang out with folks that are speaking German. Tell them you want to hear no English at all, all night. Do this for a few weeks, maybe months, and you'll be speaking German soon.
Edit: Maybe I should add that you should focus on comprehension before speaking. Try to simply understand what other people are saying, first. While at an intermediate level, you'll find that you can understand a friend well, but a stranger almost not at all. It'll take some time listening to a variety of people before you can understand many speaking patterns.
There is the theory that areas of our brain that were used for language acquisition as young children are "repurposed" as we age. From an evolutionary perspective, it is advantageous to commit a lot of brainpower to acquiring language initially, but thereafter this is not a particularly critical skill (esp. in traditional small monolingual communities). So I'd suggest that in your case this process has just been carried out to an extreme degree.
I also had 13 years of French instruction and even lived with a francophone girlfriend for several years. I never learned to speak French. On the other hand, I have learned Japanese as an adult without taking any courses.
There are two main things to understand. First, it takes a very long time to learn a language; many people completely underestimate the task. From the age of 3, humans acquire about 1,000 word "families" per year (police, policeman, police truck, and policing would all be one word "family"; conjugations, even irregular ones, count within a word family too). This continues until they are 20. Most university-educated adults have more than 25,000 word families in their vocabulary, and even the average 5-year-old has 5,000.
While you may have been an early developer with language, I think it is likely that your impression of your early abilities is hampering your current development -- mainly because it sets unreasonable expectations. At 6 months old, you may have had a vocabulary of 500-1000 words (it is not unheard of). Most children usually have a vocabulary of 1500 word families by the age of 3 and 5000 by the age of 5.
I recommend talking to 3-year-olds and 5-year-olds. Their command of their native language is truly awful. And yet instructors of various languages imply (and sometimes outright say) that their course, with 500 words of vocabulary and a smattering of grammar, will allow you to "learn" the language. Adult-level fluency is somewhere on the order of 15,000 word families. Even advanced learners with 7-10K words of vocabulary often cannot understand movies without subtitles!
The main problem that people tend to have is that there is a huge disconnect between what they expect to accomplish in a short time frame and what they actually can accomplish. People bandy around ideas like "You can have adult like fluency with 2000 words of vocabulary", but it is absolutely wrong.
Basically, if you learn 20 new words a day (which is not unreasonable with spaced repetition software), you can get to 15K words in 3-5 years but you have to study every single day without a break. More likely is that you can reach that kind of level in about the same time a child can -- 15 years.
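As a back-of-the-envelope check on the timeline above (a toy calculation; the 20-words-a-day pace, the 15,000-family target, and the retention rate are the parent comment's estimates or outright assumptions, not research data):

```python
# Rough projection of vocabulary growth from daily study.
# All figures here are illustrative assumptions taken from
# the comment above, not established facts.

new_words_per_day = 20
target_families = 15_000

# Raw arithmetic: about two years of uninterrupted additions.
years_raw = target_families / new_words_per_day / 365
print(f"{years_raw:.1f} years if every new word stuck")

# Spaced-repetition reviews pile up on top of new cards, and some
# words never stick. Assuming (purely for illustration) that only
# half of new cards are retained long-term stretches the timeline
# toward the commenter's 3-5 year figure.
retention_rate = 0.5  # assumed, not measured
years_with_attrition = target_families / (new_words_per_day * retention_rate) / 365
print(f"{years_with_attrition:.1f} years with attrition")
```

The gap between the two numbers is why "never take a day off" matters: missed days and forgotten cards compound.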
The second issue is grammar. There are only about 1,500 grammar rules in a typical language. You can rip through them in much less than a year. At this point you will have a fascinating understanding of the language and still be unable to order a drink at a fast food restaurant.
Language acquisition is different than learning (I think this is now the accepted theory, but it's actually a fairly recent development -- in the last 30 years or so). Originally people thought that you memorised language and then practiced it to get faster and faster. Current theory suggests that acquired language springs to mind without pre-planning (we have an associative memory after all). When you are in the correct context (i.e. you want to say something), the correct thing just pops in to your head. Similarly, when you listen to something, understanding occurs without logical analysis. If you have learned language, you are essentially using look up tables and logic to divine the meaning. If you have acquired language, then you go straight from language to meaning (or vice versa).
Studying grammar helps you learn grammar, but does not necessarily help with acquisition. The current accepted theory is that repeated exposure to language that you can understand (usually from context) leads to acquisition.
So what are the practical tips? First, having a course 2-3 times a week will lead to exactly the result you have described: you will not learn the language. You must study every single day (well, my experience has been that 5 days a week will still lead to very slow progress, 4 days will keep you at a standstill and anything less than that will be a treadmill of instantly forgetting what you learned).
Study does not need to be the kind of study you are used to. The absolute best way to learn a language (according to the literature) is free reading. Pick up something you want to read and read it. It should be level appropriate... which is a pain for any language other than English (for which there are about a billion graded readers).
I have found an interesting technique (which I did independently, but later found that it is a very common technique ;-) ). Pick up any book/magazine/whatever that you want to read. Skim it and write down any word you don't know. When you get to 20, look them all up in the dictionary. Use whatever technique you like (spaced repetition software is awesome) to memorise the definitions of the words. Once they are memorised, read the passage again. If you still can't understand it, look up any potential grammar issues in a grammar dictionary. Write down the sentence with the grammar you want to learn along with your own translation of the sentence (after you puzzle it out). Memorise that sentence. Rinse and repeat.
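The skim-and-batch routine above can be sketched in code. This is a toy illustration: the batch size of 20 comes from the comment, while the `known` set and the `deck` set are stand-ins for a real dictionary lookup and a real spaced-repetition deck.

```python
# Toy sketch of the vocabulary-mining loop described above:
# skim a text, collect unknown words, and flush them to a
# spaced-repetition deck in batches of 20.

BATCH_SIZE = 20

def mine_text(text, known, deck):
    """Collect unknown words from `text`; every BATCH_SIZE new
    words, 'look them up' and add them to the SRS deck."""
    pending = []
    for word in text.lower().split():
        word = word.strip(".,;:!?\"'()")
        if word and word not in known and word not in pending and word not in deck:
            pending.append(word)
        if len(pending) == BATCH_SIZE:
            deck.update(pending)  # stand-in for dictionary lookup + SRS add
            pending.clear()
    return pending                # leftovers below one full batch

# Usage: after memorising a batch, reread the passage and repeat.
known = {"der", "die", "das"}     # words you already know (illustrative)
deck = set()
leftover = mine_text("Der Hund läuft durch den Park", known, deck)
```

The point of batching is practical: twenty look-ups at a time is enough to be worth opening the dictionary, but small enough to memorise before rereading the passage.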
If you are the kind of person who doesn't mind rote work, you can simply memorise all the example sentences in a grammar dictionary (or text book -- although text books suck for the most part). But you still need to do free reading to encounter it in the wild.
For listening, watch TV (or video web sites). For speaking, I have 2 techniques. Read and memorise songs for any genre of music that you like using the technique above. Get recordings of those songs. Sing along. The other technique is to find a news web site that has video news stories as well as a printed version of the story. Read the story. Listen to the story. Then try to read the story at the same time as the news reporter. Try to match pacing, intonation, etc exactly. You can record your voice and compare with the original.
Finally, you need conversation partners. If you live in a large city, you will almost certainly be able to find language meetups in your area. Go and chat with people. Otherwise, there are language exchange websites where you use Skype or whatever to chat with people. Being a fluent English speaker will guarantee that you can find as many partners as you like.
Hope that helps! I've had success with these techniques myself, and I taught them to my students when I was teaching English as a foreign language. I find them very effective. Just temper your expectations and never take a day off. Also realise that courses can never teach you a language, because the language is too huge. The best they can do is teach you techniques that will allow you to learn a language. If you are taking a course and it is all about the content, find another course.
No, I can't speak a single word. I try to remember words, but there's just nothing there. I can't think of a single word.
If you show me some simple words, I can probably remember what some of them mean, but I can't retrieve any word from my mind. It's just like the sensation where you want to say something, but can't remember a word, and you try real hard to remember that word. Except that's all I feel, only that sensation. There are no words I can remember.
Of course, I just opened a French website and tried to read it. Yes, I recognise some words, and there is even a minimal level of understanding. But I could not think of those words before reading the text.
Yea, I was reading through the article and wondering why they didn't mention actual researchers. Where's the discussion of Tomasello's work, for example? Then down at the bottom, I saw:
More to Explore
Constructing a Language: A Usage-Based Theory of Language Acquisition. Michael Tomasello. Harvard University Press, 2003.
Constructions at Work: The Nature of Generalization in Language. Adele Goldberg. Oxford University Press, 2006.
Language, Usage and Cognition. Joan Bybee. Cambridge University Press, 2010.
Some of the usual suspects. Further discussion of these would have improved the article immensely. I generally agree with the article, but there are a lot of nits to pick—it mischaracterizes the debate over Pirahã and recursion, for example, and significant evidence against the pro-drop parameter actually came from Western European languages.
The ruffling seems to come from the term "universal grammar." Without a proper understanding, the term can cause all sorts of misapprehensions. It sounds like a mandate; it makes people rebellious. Noam Chomsky specifically disavows any evolutionary theory; he just hand-waves it for the establishment of the model. Many people who "refute" him simply attempt to provide this. All he intended to build was the simplest conceivable system for language parsing. That's all he ever claims to have. When he responds to criticism, it is always staunchly from this context. People just don't care to understand the propositions of the actual idea. I suspect they're intent on invalidating him to undermine his political views.
Many -- perhaps even most -- of his fiercest opponents within linguistics agree strongly with most of his political views. George Lakoff, for example, is well known as a leading anti-Chomskyite linguist, and also as a vocal far-leftist whose political views largely echo Chomsky's.
I'm not so sure that it's correct to call Chomsky a leftist. He clearly is not a Marxist and finds the ideology ridiculous, at the same level as religion. If Lakoff is a Marxist or socialist, then the two clearly do not have political views that are compatible.
Chomsky is, I believe, best referred to as a social libertarian, which means that he opposes both state and enterprise power, sees them as more or less the same, and instead favors the individual. It is clear why Marxists don't like him at all.
I think this is also why the authors of the original article challenge Chomsky without offering anything other than minor claims, hidden away throughout the text, that language is a product of social interaction rather than a capability that lies within the individual. It is about the collective versus the individual, and since Chomsky is a fierce individualist, the authors disagree with him on political grounds, not scientific ones.
Not all leftists are Marxists, though they have come to dominate the field. The broad anti-capitalism of anarcho-syndicalism, combined with his well known criticisms of American and western power for so called imperialism, definitely slot him into the "leftist" category in my political schema.
> Not sure why linguistics ruffles some people's feathers.
My hypothesis is that it's Chomsky personally who ruffles people's feathers. And this is not because his work on linguistics but rather his political views.
I really doubt it's his politics. George Lakoff and Chomsky had (have?) a protracted, acrimonious battle over the relationship between syntax and semantics, but they're not miles apart politically.
I find Chomsky frustrating because he comes up with this incredibly elegant theoretical machinery--and then is totally uninterested in empirically testing it.
For example, the "Language Acquisition Device" absolutely HAS to be the brain, but a lot of the discussion of it--and of Universal Grammar--is totally untethered from psychology and neuroscience.
For a while, it seemed like he regarded recursion as a key aspect of human language, versus animal "communications." Hauser, Chomsky, and Fitch had a 2002 article in Science that seemed to argue as much, and they were excited to show that humans, but not monkeys, can learn center-embedded recursion patterns[1]. However, Chomsky seems to have subtly backed away from that and now says that recursion is important but might not manifest itself in all human languages (or something along those lines).
That somebody genuinely disagrees with Chomsky's linguistic theories on their own merit doesn't mean that other people don't want to prove his linguistic theories wrong because they disagree with his politics. Both sorts of people can exist.
George Lakoff is one of his fiercest critics and I have a really hard time believing that he is motivated by Chomsky's political views; Lakoff is involved with various progressive and socialist think tanks himself. I'm not sure about (e.g.) Postal, Ross, or McCawley but I've never heard anything about their political animosity to Chomsky either.
If one wanted to go after Chomsky's political views, falsifying Universal Grammar (whatever that means) seems like an oddly difficult and indirect way to do so. Do you really think that someone decided that a deep dive into recursive structures in Pirahã is the way to push for TPPA? I'm totally willing to believe that more conservative linguists might get a tiny bit more of a thrill out of needling him, but again, this seems really unlikely to be a driving force.
If there's any political force that drives people to go after Chomsky, it's the one driving the academic job market. He is A Big Name, and conclusively debunking his theories would set the debunker's career on the fast track. In fact, even a high profile debate with him, assuming it wasn't a complete rout, would probably benefit most careers.
I get the impression that it goes beyond a lack of interest in empirical testing; he (and his acolytes) seem unreasonably hostile towards anyone who dares to question his elegant theories - or to point out that his position has changed.
Chomsky's politics have gotten him nearly blacklisted by mainstream media sources. They have not impacted his academic career in any meaningful way that I can detect. In fact, they have almost certainly amplified his fame and notoriety amongst academic circles, including in Linguistics, which has similar politics to most of the rest of academia i.e. it is extremely left wing.
As for the man's personality and interpersonal relationships, he is quite calm, even-tempered, and respectful. He is not known for ad hominem attacks or public meltdowns; quite the opposite. He's a bit dry and boring, to be honest.
He's hugely influential in the field; arguably dominant in a way that is unhealthy for a field. Acolytes will say he's influential because of merit, detractors will say it is because of rhetorical skill, sophistry. I hope there will come a time when press releases no longer breathlessly announce Chomsky's theories falsified for the umpteenth time, but rather actual progress.
Chomsky nails politics and 'how the world works.' It is easy to understand why he is censored out of the news media. Not to go off on a tangent, but a brief opinion: I have made it a hobby for two decades comparing the coverage of specific news stories in different countries. I have noticed the censorship in the USA has become a much worse problem in the last few years.
Some of the context for this can be found in the 'Poverty of the stimulus' Wikipedia entry. An attempt was made to cast Chomsky's argument in theorem form, in a paper by E. M. Gold, apparently settling the question (in favour of Chomsky), but despite that result computational linguists have been busy doing useful work in NLP with machine-learning methods, and the debate has now shifted from armchair discussion to questions about the value of the emerging language technologies and the role of unsupervised learning. Less famously, Zellig Harris (Chomsky's thesis advisor) is now cited in the deep-learning language-modelling literature as a precursor of the distributional model pushed by Bengio, which is now used in word2vec and other NLP work.
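Harris's distributional idea (words that occur in similar contexts tend to have similar meanings) can be illustrated with a toy co-occurrence model. Everything here is a made-up miniature: real systems like word2vec learn dense vectors from huge corpora, whereas this sketch uses raw counts over three sentences.

```python
# Toy illustration of the distributional hypothesis (Zellig Harris):
# represent each word by the words appearing near it, then compare
# those context profiles. word2vec learns compressed versions of
# essentially this information; this miniature uses raw counts.
from collections import Counter
from math import sqrt

corpus = [
    "the cat drinks milk",
    "the dog drinks water",
    "the cat chases the dog",
]

def context_profile(word, window=1):
    """Count words appearing within `window` positions of `word`."""
    profile = Counter()
    for sentence in corpus:
        tokens = sentence.split()
        for i, tok in enumerate(tokens):
            if tok == word:
                lo, hi = max(0, i - window), i + window + 1
                profile.update(t for t in tokens[lo:hi] if t != word)
    return profile

def cosine(p, q):
    dot = sum(p[w] * q[w] for w in p)
    norm = sqrt(sum(v * v for v in p.values())) * sqrt(sum(v * v for v in q.values()))
    return dot / norm if norm else 0.0

# "cat" and "dog" share contexts ("the", "drinks"), so their profiles
# come out more similar than those of "cat" and "milk".
print(cosine(context_profile("cat"), context_profile("dog")))
print(cosine(context_profile("cat"), context_profile("milk")))
```

No grammar is hard-coded anywhere in this sketch; whatever structure emerges comes purely from distribution in the data, which is exactly the contrast with the nativist position being debated.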
I believe the chart shown in the article, titled "Noam-enclature And The New Linguistics," mischaracterizes Chomsky's theory by conflating the commonly used meaning of the word "grammar" with grammar as it was defined in Chomsky's paper. In essence, this chart is a straw-man argument.
The top part of the chart shows "the brain's innate sentence-diagramming machine," and then points toward a sentence structure broken down into designators, adjectives, nouns, and verbs, which is what the common definition of the word "grammar" means. This is not what Chomsky's theory stated. Chomsky's theory defined grammar as (I'm paraphrasing) a system for understanding language, involving a syntactic component (e.g. there should be some kind of order), a semantic component (e.g. there should be some kind of meaning), and a phonological component (e.g. there should be some kinds of sounds).
That's it! There was no inherent underlying English grammar structure hard-wired into babies' brains, as the chart suggests. What is wired into a baby's brain, and assists with language, is the ability to order sensory inputs in a way that results in language... that is what he called grammar. Basically, what Chomsky did was "psychologize" linguistics, and I am not convinced by this article that his theory has been refuted. That would be like saying that Einstein refuted Newton.
> There was no inherent underlying English grammar structure hard-wired into babies' brains as the chart suggests.
There was in Chomsky's original theory. His original claims have been significantly softened under the mounting weight of counter-evidence.
No, there wasn't. That's the poster's point. Universal grammar in Chomsky is lower-level than any concrete language, English most definitely included. https://en.wikipedia.org/wiki/Universal_grammar
The first two sentences should suffice.
The characterization on that page is not good: virtually no one disagrees with "the ability to learn grammar is hard-wired into the brain". The substantive disagreement is whether grammatical categories and relations (or other knowledge that helps the child induce such) is hard-wired, or whether more general-purpose learning mechanisms are sufficient to abstract this structure from the environment.
>virtually no one disagrees with "the ability to learn grammar is hard-wired into the brain".
No, this is exactly where the field is split. Chomsky says that if a child socializes with people speaking a language with each other, the child will learn that language without teaching, punishment, or reward. I.e., the human mind is not a blank slate but has innate capabilities. Ideologically, this is an individualist take on linguistics.
The other side says that children have to be taught to speak a language properly and that language is difficult. Ideologically, this is the collectivist take on linguistics - children are blank slates to be filled in by society.
The people that have a problem with Chomsky's ideas are authoritarian collectivists. It's not about science.
You're making a straw man caricature out of the nativism detractors. Current researchers do not claim that first language is taught, that's why the field is called language acquisition regardless of whether one subscribes to nativism. The more nuanced non-nativist theories argue that grammar learning employs domain-general cognitive mechanisms, rather than language-specific innate grammar knowledge or principles. It's quite a stretch to connect any of this with political ideologies.
>In the second half of the 20th century, it was becoming ever clearer that our unique evolutionary history was responsible for many aspects of our unique human psychology
In my experience, most "evolutionary psychology" stories are also ad hoc, unfalsifiable at best, and loaded with cultural biases from those who propose them. We don't have a good idea of what traits actually carry a selective advantage, and those advantages are often highly context dependent. The brain, culture, and evolution are all incredibly complex, so you can often find some way to smush puzzle pieces together that almost works, but that doesn't mean that's what's actually happening.
A lot of traits and variances have no clear evolutionary advantage; most of the time it's fudging, trying to come up with an explanation for a certain behaviour.
There aren't good evolutionary explanations for a lot of physical traits, let alone for psychological ones
> when linguists actually went looking at the variation in languages across the world, they found counterexamples to the claim that this type of recursion was an essential property of language. Some languages—the Amazonian Pirahã, for instance—seem to get by without Chomskyan recursion.
This is a pretty thin counter-example. A Christian missionary who became a linguist claimed that the Pirahã language had recursion. Then he later claimed that it did not have recursion. Linguists went down to study the Pirahã and found out that his first reports of Pirahã being recursive were correct, as they observed recursion in Pirahã language when they were there.
The authors return to this language spoken by less than 400 people later on:
> Chomsky defenders have responded that just because a language lacks a certain tool—recursion, for example—does not mean that it is not in the tool kit... makes Chomsky’s proposals difficult to test in practice, and in places they verge on the unfalsifiable.
Recursion in languages has not been falsified - it has been found in all languages, including Pirahã.
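For readers unsure what "Chomskyan recursion" means formally: a grammar whose rules can (directly or indirectly) invoke themselves generates unboundedly nested sentences from a finite rule set. A toy sketch, using an illustrative grammar of my own, not any linguist's analysis of a real language:

```python
import random

# Toy context-free grammar: NP can embed a VP ("the linguist who slept"),
# and VP can embed an NP, so NP indirectly invokes itself. That mutual
# self-reference is the "recursion" at issue in the debate.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "who", "VP"]],
    "VP": [["V", "NP"], ["V"]],
    "N":  [["linguist"], ["missionary"], ["dragon"]],
    "V":  [["saw"], ["doubted"], ["slept"]],
}

def generate(symbol="S", depth=0, max_depth=4):
    """Expand a symbol; cap embedding depth so generation terminates."""
    if symbol not in GRAMMAR:
        return [symbol]  # terminal word
    rules = GRAMMAR[symbol]
    if depth >= max_depth:
        rules = [rules[0]]  # past the cap, use the non-embedding rule
    out = []
    for sym in random.choice(rules):
        out.extend(generate(sym, depth + 1, max_depth))
    return out

print(" ".join(generate()))
```

Five rules, unbounded sentences: that is the property Everett claims Pirahã manages without.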
I think you're giving Dan Everett (the person you're describing) short shrift here.
He says that his initial pro-recursion reports were attempting to fit Pirahã into the standard Chomskyian UG framework of the time. However, as he learned the language better, he started to realize that it didn't really fit and that he had been seeing the things he expected to see. For example, he initially described -sai as a nominalizer (an "adapter" that lets other parts of speech be used in constructions that expect a noun). His later reports argue that it's actually not a nominalizer, but marks old information ("As you know Bob").
As far as I know, there is not actually a ton of non-Everett data on Pirahã.
The most well-known paper that disagrees with Everett is probably Nevins, Pesetsky and Rodrigues (2009), which reanalyzes Everett's published work. They start with a list of properties that are supposedly unique to Pirahã, and claim that many of these show up in familiar languages. However, I don't find that paper completely convincing. For example, Pirahã possessives can't be embedded: the equivalent of "John's brother's house" is ungrammatical and needs to be rendered as "John has a brother. The brother has a house." NPR argue that German has a similar restriction. This is true, sort of, in that "Hansens Auto" (Hans's car) is grammatical, but "*Hansens Autos Motor" ("Hans's car's motor") is not. However, this occurs because of a totally unrelated restriction: you can say something like "Hansens Vaters Frau" (Hans's father's wife) because everyone involved is animate, and there's an alternate route involving "von" for embedding other kinds of possessives (as in "the motor of John's car"). Pirahã, as far as I know, does not have any possessive embedding at all, which does make it different from German.
I'm also not sure that cherry-picking individual examples from many other languages successfully refutes Pirahã's exceptionalism. German may prohibit embedding in this circumstance and Hindi in that one, but surely the interesting part is that Pirahã seems to prohibit it everywhere. As an analogy, I know people who do not write with their left hands, and other folks who don't write with their right hands. However, this doesn't imply that hands aren't involved in writing at all.
Realistically, this debate is going to be hard to settle without more data. Unfortunately, Pirahã is said to be impossibly difficult to learn and there aren't a ton of native speakers wandering around universities. There is some data kicking around from an Australian language which may have similar properties, which might help.
There is no consensus about this question. Both sides clearly have an ax to grind. Everett knows the language best, so I'd lend more credence to his judgment. Regardless of the answer, it only means that Chomsky's theory that everything is about recursion may or may not have been falsified, not that it is a good theory.
While Chomsky's attempt to formalize a grammar is wonderful, the probabilistic approach (based on weights) to language advocated by Norvig is, it seems, closer to what is going on inside the brain.
The brain does not perform recursive algebraic simplifications (which are, probably, the most beautiful abstraction humans have produced: fixed points, the Y combinator, etc.); it performs structure-based pattern matching: "it looks like a duck".
Natural selection, it seems, would select the crudest good-enough model, full of cues, shortcuts and heuristics learned from the environment, instead of evolving a beautiful, universal, recursive meta-circular evaluator of the kind the MIT guys worship (what else is there to worship but beauty?!)
And, of course, mutually recursive processes (physical processes which could be abstracted out and formalized that way) are everywhere in biology. It just happens that language is pattern matching, not an algebra, like everything else in the brain. Algebras, like numbers, do not exist outside our heads.
Algebra and pattern matching are basically the same thing, aren't they? To find a solution to an algebraic problem, you pattern match on the problem structure and see what falls out. To solve a pattern matching problem, you perform algebraic transformations to isolate the pattern being matched.
>Algebras, like numbers, do not exist outside our heads
Models are called models because they are not the things they model. Anything that is a model is an approximation. This doesn't mean they don't exist outside our heads. If that were true, there wouldn't be any computers.
The meaning of the term pattern-matching here is "mechanical" rather than "mathematical". Think of the shapes of the proteins in an enzyme. Their structure defines the behavior of the enzyme, and only a molecule that matches (in physical shape and electro-chemical properties) will do the thing.
Similarly, it seems, higher-level behavior is defined by the physical structure of neurons, a highly specialized circuitry evolved by trial and error, which happens to be good enough for its purpose. There is no algebra involved. It is an electro-mechanical, massively parallel machine.
> Algebra and pattern matching are basically the same thing
The difference is that pattern matching is done on a model of the world provided by words, instead of being applied directly to the senses.
In reinforcement learning there is an approach that models the world first, then runs RL on top of the model, because that gives the agent a way to imagine possible outcomes without having to try them first in actuality. Model-based RL is a refinement of direct RL and is essential for thinking.
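The model-based idea can be sketched in a few lines. This is a toy deterministic chain world of my own invention, not a real RL benchmark: the agent first fits a transition model from experience, then plans entirely "in imagination" on that model.

```python
import random
from collections import defaultdict

random.seed(0)

# Toy world: states 0..4 on a line, actions -1/+1, goal at state 4.
N_STATES, GOAL = 5, 4
counts = defaultdict(lambda: defaultdict(int))  # (s, a) -> {next state: count}

def step(s, a):
    """The 'real world' dynamics, unknown to the agent."""
    return max(0, min(N_STATES - 1, s + a))

# 1. Gather random experience and fit a transition model from counts.
for _ in range(1000):
    s, a = random.randrange(N_STATES), random.choice([-1, 1])
    counts[(s, a)][step(s, a)] += 1

def model(s, a):
    """Most likely next state under the learned model."""
    nxt = counts[(s, a)]
    return max(nxt, key=nxt.get)

# 2. Plan on the model, not the world: value iteration over imagined outcomes.
V = [0.0] * N_STATES
for _ in range(50):
    V = [1.0 if s == GOAL else 0.9 * max(V[model(s, a)] for a in (-1, 1))
         for s in range(N_STATES)]

policy = [max((-1, 1), key=lambda a: V[model(s, a)]) for s in range(N_STATES)]
print(policy)  # every state's best action points toward the goal
```

The point of the design is that step 2 never calls `step` (the real world) at all; once the model is learned, outcomes are tried out in imagination only.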
The article makes it sound like Chomsky's early approach was similar to the grammar you learn in school. That's actually what linguistic classification was like before Chomsky, as I understand it. Its first mention of recursion is of a Chomsky paper from 2002, where everything is done with just a few rules, but recursion was also critical to Chomsky's earliest results.
Coming away from reading it, I would think Chomsky had a really naive approach that had to be modified with Principles and Parameters just to handle Spanish first-person verbs, because his program was so narrow they hadn't bothered to consider Spanish. Then it wasn't until 2002 that they started handling recursion.
It doesn't quite say that directly, but that is just what I think someone might take away from reading the intro.
Anyone else feel that ending the first section with the well-known "science progresses one funeral at a time" quote is rather too sharply pointed, given Chomsky's age?
My colleagues and I just got a paper accepted at Psych Science on this debate: we looked at the degree of generalization kids show in using determiners/articles ('a', 'an' or 'the') with nouns, also accounting for imitation of caregiver speech. If a child has heard "the cat" and "a cat", but only ever "the fish", when do they know they can generalize and say "a fish"? TLDR is that it's hard to say conclusively without larger samples of caregiver speech, but it seems like early speech (<2 yrs) shows little evidence of category-based generalization, which then rapidly increases after that. Of course it's possible to contest that kids have more grammatical knowledge than what's reflected in what they say (the performance vs. competence distinction), but these methods can't address that.
HN readers may appreciate the methodology: we construct a beta-binomial model as a simple approximation of the generative process, and measure how the parameters change over time when we fit to real data. The child's definite vs. indefinite uses for each noun are treated as flips of a weighted coin; the weighting is determined by the definite and indefinite uses by the parent combined with a beta prior. The peakedness of the beta prior indicates how similar a kid thinks nouns will be to one another (in the absence of evidence). We fit this model to all the successive samples of a bunch of longitudinal developmental datasets, including the Speechome project out of MIT. Nonzero, constant peakedness would be consistent with access to syntactic categories from birth; increasing peakedness of the scale parameter corresponds to growing syntactic generalization.
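For the curious, the beta-binomial setup can be sketched in a few lines. This is a toy illustration with made-up counts, not the paper's actual model or data; `mu` and `nu` are my assumed names for the prior's mean and peakedness (the standard mean/concentration parameterization of the Beta):

```python
from math import lgamma

def log_beta_binom(k, n, alpha, beta):
    """Log probability of k definite uses out of n total under a
    beta-binomial with prior Beta(alpha, beta)."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + lgamma(k + alpha) + lgamma(n - k + beta)
            - lgamma(n + alpha + beta)
            + lgamma(alpha + beta) - lgamma(alpha) - lgamma(beta))

# Hypothetical per-noun (definite uses, total uses) counts from one sample
# of child speech. Each noun is a separate weighted coin.
nouns = [(3, 10), (5, 9), (4, 8), (6, 11)]

def log_lik(mu, nu):
    """Joint log-likelihood: prior mean mu, peakedness nu."""
    return sum(log_beta_binom(k, n, mu * nu, (1 - mu) * nu)
               for k, n in nouns)

# Grid search over peakedness: the better high values of nu fit,
# the more the nouns behave like one shared syntactic category.
best_nu = max([0.5, 1, 2, 5, 10, 50], key=lambda nu: log_lik(0.5, nu))
print(best_nu)
```

Refitting at successive ages and watching whether the best-fitting peakedness grows is, roughly, the shape of the analysis the comment describes.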
After reading the article, I still have to side with Chomsky on this one
Universal grammar may not exist (exactly) as he described but "young children use various types of thinking that may not be specific to language at all—such as the ability to classify the world into categories, and to understand the relations among things"
And what says this is not language related? It's an underlying structure that helps organize concepts. Who says thinking in shapes or concepts is not a language?
Also, the fact that some languages have not evolved recursion doesn't mean it's not a defining trait. In the same way, western languages are not tonal, but that doesn't mean a western kid wouldn't be able to learn a tonal one.
I think you've struck on something, sort-of. How's this Neo-Chomskian theory:
The structure of the Universal Grammar and its related parameters is not encoded in the brains or genetic codes of humans, but in the very nature of the universe. The Universal Grammar is universal. If you wish to describe an action, any action, you need a subject, a verb, and an object. The verb must be transitive, and the subject and object must be both nouns but distinguishable from each other. In other cases where there is no part of reality that takes the place of an object, you can have an intransitive verb, but then you cannot have an objective noun.
The parametrization of the language, such as in word order (SVO, VOS, etc.) occurs through the historical development of the language, by symmetry breaking and the requirement that language be a one-dimensional, temporal structure.
The relationship between language and the human brain, or even the human mind, is largely accidental. The mind may or may not have a "recursion" module, or slots for "subjects" and "verbs", but whether it does or does not is immaterial. The Universal Grammar is the fundamental nature of reality, not the nature of the brain. Whether the brain has such structures or not depends entirely on how well the brain represents reality, and we already know that is imperfect.
In the beginning was the WORD, and the WORD was with god, and the WORD was god.
> ability to classify the world into categories ... what says this is not language related
It is data related. The patterns exist in the data itself; young children are just uncovering the latent factors that explain the data. So the ability of children is to learn from patterns.
> It is data related. The patterns exist in the data itself
This is a cop out. You can't get semantics (and grammar) out of "types of words" only (or you couldn't have declensions)
The important thing about recursion is not that it doesn't exist in some places, but the fact that it exists.
The language capabilities of the brain seem to be a "hardware-accelerated" heuristic that allows things to be connected more easily.
Given that cognitive psychologists can't even reproduce the experiments they're supposedly experts of, I'm not holding my breath for an explanation from them.
"I have not failed, I have just learned thousands of ways which don't work" applies to science as well. It bothers me that popular science often presents groundbreaking work as a direct competitor to an established theory, while usually the established theory is a necessary stepping stone to the new one: usually, because it's the more obvious approach and explains some aspects of the system. The old theory can be used as an instrument to gauge the qualities of the system under observation, and the incompatibilities found then provide fertile material for new understanding.
Unfortunately, this rhetorical approach is commonly employed in unpopular, ahem, professional science as well. No one wants incremental progress; everything has to be transformative (science's version of "disruptive").
Fortunately, this has yet to pervade other industries we all know and love.
The article, and many professional functionalist linguists, confuse Chomsky's Universal Grammar theory and its later "parameters" revision with the much more general hypothesis, also advocated by Chomsky, that human language grammar is somehow computable in a logical sense, formed through the application of objective and universal (with a small "u") logical rules. They prefer to view language rather as some sort of nebulous and inherently unquantifiable expression of the human spirit that is intrinsically bound to the unique cultural attributes of the society that produces it, a precious snowflake different from all others, which cannot be related to other languages through universal logical principles.
They feel that the notion of universal (small U) principles of grammar somehow robs language of some essential human qualia by reducing it to a mere mechanism, and moreover many postmodernist linguists believe that the very idea of objectivity in language, to include supposed universal grammatical principles, is somehow part of a broader imperialistic plot by evil white heterosexual men to oppress and subjugate the glorious Other through malicious efforts of universal grammatical classification. (Yes, seriously. Your tax dollars support more than a few professors who believe this is true, in all seriousness, and teach it to impressionable young students every day.)
These fears do not, however, have any bearing whatsoever on whether human language is, as a matter of fact, computable, the result of some set of universal logical principles. It either is or it isn't, and wishful thinking about how we would like to conceive of language has no bearing on whether that is true or not. That is not how scientific knowledge works. (Of course, many denounce the scientific method itself as merely another tool used by evil white straight men to oppress the Other, and this despite the fact that it was invented by Arabs.)
The anti-computation camp frequently conflates Universal Grammar or parameters theory with the question of computability in general. When any evidence comes out that shows that a specific portion of UG is probably not true, they crow triumphantly about how language is really fundamentally something ineffable and pseudo-mystical, not able to be computed by logic.
UG and the computability question are not the same thing. UG is a specific proposed model of language computation. It has largely been falsified, shown to be inconsistent with the empirical evidence of how actual languages work. That is fine. This does not imply anything at all about whether language is universally computable according to some model.
It just tells us that language is not computable according to that model, not that it is not computable according to any model.
>> When any evidence comes out that shows that a specific portion of UG is probably not true, they crow triumphantly about how language is really fundamentally something ineffable and pseudo-mystical, not able to be computed by logic.
I think the overwhelming mass of work critical of Chomsky treats it as a computational object; the emphasis changes from global computability to local inference: what can a learner infer about the structural properties of a language? How can people parse complex sentence structures and ascribe meaning to them? Hence the rapid adoption of machine learning for explaining human behavior in psycholinguistics, developmental linguistics, and historical linguistics. Nothing pseudo-mystical there.
Sure, of course not all linguists who disagree with UG are anti-computationalists. (I don't know about "the overwhelming mass," but that is possibly the result of my exposure to a local concentration of anti-computationalists.) Many tentatively accept some sort of computability while rejecting the specific model of UG, as you correctly note.
My comment was directed at the author of the specific article, and at many functionalist and postmodernist linguists, who actively conflate the question of computability in general with the validity of UG as a specific model.
So, if there is no innate grammar, why does the phrase "large red ball" feel more correct than "red large ball", and in many languages (I can confirm only two)? If this is just a learned relationship between things, the order of the adjectives shouldn't matter. Furthermore, in Czech, words in a sentence can come in many different orders (in particular, the final word is the one that gets emphasis), but the order of adjectives still matters (or at least some orders feel right and some wrong). So we can learn and deal with different orderings of words.
> So, if there is no innate grammar, why the phrase "large red ball" feels more correct than "red large ball"
Word order is just dependent on language. In Romance languages it would be "large ball red"
No, the question is deeper.
Is there innate parsing "hardware" that can parse (and produce) a phrase based on a structure? What are the limitations of that structure? (And I'd say that if there are no limitations, this article is right.)
There are several grammars that can encode "2 + 4 > 3", for example; you can express the same thing with a Lisp grammar, but beyond the grammar specifics, you're expressing a meaning.
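That point can be made concrete: the same proposition rendered under Python's infix grammar and under a Lisp-style prefix grammar evaluates to the same truth value. A toy illustration; `eval_sexp` is my own hypothetical mini-evaluator, not any real Lisp:

```python
# The proposition "2 + 4 > 3" under two surface grammars.
# Different syntax, identical meaning once evaluated.

infix = 2 + 4 > 3  # Python's conventional infix grammar

def eval_sexp(expr):
    """Evaluate a tiny s-expression of the form (op, arg1, arg2)."""
    if not isinstance(expr, tuple):
        return expr  # a bare number
    op, a, b = expr[0], eval_sexp(expr[1]), eval_sexp(expr[2])
    return {"+": a + b, ">": a > b}[op]

# Lisp-style prefix form: (> (+ 2 4) 3)
prefix = eval_sexp((">", ("+", 2, 4), 3))

print(infix, prefix)  # the two grammars agree on the meaning
```

Note that `eval_sexp` itself is recursive: even this tiny prefix grammar needs embedding (an expression inside an expression) to encode the sum inside the comparison.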
Oh my god, a PNG of a tweet featuring a PNG. What is this world we have created.
Anyways.. it's pretty damn interesting. When I read it, I seriously read the last sentence as "great green dragon" without thinking, then went back and saw that I actually read the words in the wrong order.
Oddly enough, a green great dragon makes me think there would be some distinct species of dragon called the "great dragon", that are sometimes green, whereas "great green dragon" makes me think of just a very large, green, run-of-the-mill dragon.
However, all of this seems to me entirely related to training, getting very very used to things being expressed in a certain way. I have no idea how it could relate to some "innate" idea of grammar, especially since, as others have pointed out, this rule and many others only hold for English. Any bilingual individual can tell straight away that these things are all about memory -- nothing "innate" about a bunch of arbitrary rules.
It can be, but the same order comes up naturally in many languages. If it really didn't matter and were just a learned convention, it could be different in different languages, just like the ordering of subject and verb is a convention (but I am not a language expert; I really only know Czech and English).
Thing is, in order to disprove your example as evidence of "innate grammar", it suffices to find one language that doesn't order it that way. I bet there are quite a few where the order doesn't even matter at all (because there are other means to indicate what's what in a sentence).
> I bet there are quite a few where the order doesn't even matter at all.
One of my favorite Latin poems (not just mine, this one is famous for a reason) begins like so:
Quis multa gracilis te puer in rosa
perfusus liquidis urget odoribus
grato, Pyrrha, sub antro?
Milton apparently translated this "almost word for word without Rhyme according to the Latin Measure, as near as the Language will permit" (http://www.dartmouth.edu/~milton/reading_room/fifth_ode/text...). Here is the relevant part of that translation:
What slender Youth bedew'd with liquid odours
Courts thee on Roses in some pleasant Cave,
Pyrrha[?]
And here, I'll rearrange Milton's words into the word order of the original poem:
What [many] slender thee youth on roses
bedew'd liquid courts odors
pleasant, Pyrrha, in cave?
(Milton's "with" is reflected in the Latin, but it is the ablative case ending of "liquidis odoribus" rather than a word in its own right. The "some" is not indicated in the Latin. "multa" is an adjective, "many", applying to "rosa"; Milton has left it out, although he has changed the oddly singular "rosa" to plural "roses".)
Anyway, this poem is pretty much the apex of artistic expression of "free word order". Latin does have some restrictions on word order, and one of them is visible in the poem: the object of a preposition must immediately follow that preposition (this is the origin of the belief that sentences can't end with a preposition -- in Latin, they can't). But mostly you can place the words where you'd like.
Thank you for the example. The "large red ball" case brought up by OP is slightly different, though, because it's not about subject/object/verb ordering, but rather about relative ordering of adjectives.
Even in languages that do allow for relatively free reordering (my native, Russian, is also like that, and also uses it heavily in poetry), "red large ball" just feels awkward compared to "large red ball", so I think there is, indeed, some actual linguistic law behind it that applies to at least these languages.
The stretch is to claim, on the basis of a few (or even a couple dozen) languages, that the law is indeed universal. It's the exact same mistake that European linguists have made before, because they were basing their assumptions on a bunch of closely related languages that they were familiar with - and once they started studying more "exotic" stuff, like Native American and Polynesian languages, it turned out that many of those assumptions were only true for that original subset they were working with, and didn't apply in general.
> why the phrase "large red ball" feels more correct than "red large ball",
Because size is a relative quantity (large relative to what?) while red is an absolute descriptor (color = red).
I don't think this is a sufficient explanation, only an empirical observation. It could very well be the other way around. I mean, it's not even generally true that related words come together in a sentence (things like "Who did you give the large red ball to?" spring to mind).