What's really funny is that I worked in DNA sequence analysis at a time when the Chomsky hierarchy was central to the field, and literally all my work was applying "probability of a sequence" concepts (specifically, everything from regular grammars to stochastic context-free grammars). It's a remarkably powerful paradigm that has proven, time and time again, to be useful to scientists and engineers, much more so than rule systems constructed by humans.
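To make that concrete, here's roughly the simplest version of the idea: score a DNA sequence under a first-order Markov chain, the regular-grammar end of the hierarchy. The transition numbers below are made up for illustration, not taken from any real genome.

```python
# Minimal sketch: scoring a DNA sequence under a first-order Markov model,
# the simplest "probability of a sequence" idea (regular-grammar level).
# Transition probabilities are invented for illustration.
import math

transitions = {
    'A': {'A': 0.30, 'C': 0.20, 'G': 0.30, 'T': 0.20},
    'C': {'A': 0.25, 'C': 0.25, 'G': 0.25, 'T': 0.25},
    'G': {'A': 0.20, 'C': 0.30, 'G': 0.30, 'T': 0.20},
    'T': {'A': 0.20, 'C': 0.20, 'G': 0.25, 'T': 0.35},
}

def log_prob(seq, start_prob=0.25):
    """Log-probability of a DNA sequence under the Markov chain."""
    lp = math.log(start_prob)              # uniform start over A/C/G/T
    for prev, cur in zip(seq, seq[1:]):
        lp += math.log(transitions[prev][cur])
    return lp

print(log_prob("ACGTGGT"))
```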
The probability of a sentence is vector-valued, not a scalar, and the probability can be expanded to include all sorts of details which address nearly all of Chomsky's complaints.
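One way to read that, as a sketch: a model assigns each token a conditional probability, so a sentence gets a whole vector of per-token probabilities, and the usual scalar score is just the sum of their logs. The bigram table here is invented purely for illustration.

```python
# Sketch: per-token conditional probabilities form a vector;
# the scalar "probability of the sentence" is just their product.
# The bigram table is invented for illustration.
import math

bigram = {
    ("<s>", "colorless"): 0.01,
    ("colorless", "green"): 0.02,
    ("green", "ideas"): 0.005,
    ("ideas", "sleep"): 0.001,
    ("sleep", "furiously"): 0.002,
}

def per_token_probs(tokens):
    return [bigram.get((prev, cur), 1e-9)
            for prev, cur in zip(["<s>"] + tokens, tokens)]

tokens = "colorless green ideas sleep furiously".split()
vec = per_token_probs(tokens)                    # vector-valued view
scalar = math.fsum(math.log(p) for p in vec)     # scalar log-probability
print(vec, scalar)
```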
You misunderstand. The Chomsky hierarchy was critical to my work on DNA.
In fact, my whole introduction to DNA sequence analysis came from applying probabilistic linguistics: see this wonderful paper by Searls: https://www.scribd.com/document/461974005/The-Linguistics-of...
I still don’t understand. What’s funny about it and what’s DNA got to do with human language sentences (the original quote)?
Chomsky’s grammars are used in compilers and compiler theory, even though programming languages have nothing to do with human languages, and certainly nothing to do with the “probability of a sentence” he was talking about. Applying something like that doesn’t necessarily tell you anything about what Chomsky is talking about, namely human language.
Funny you mention that: after working with probabilistic grammars on DNA for a while, I asked my advisor if the same idea could be applied to compilers, i.e., if you left out a semicolon, could you use a large number of example programs to train a probabilistic compiler to look ahead far enough to recognize the missing semicolon and continue compiling?
They looked at me like I was an idiot (well, I was) and then said, very slowly: "Yes, I suppose you could do that... it would be very slow, you'd need a lot of examples, and I'm not sure I'd want a nondeterministic parser."
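For what it's worth, a toy version of what I was imagining looks something like this: try inserting the semicolon at each position and keep the repair that a simple bigram model over token kinds scores highest. The token kinds and probabilities are invented, and no real compiler works this way; it's just a sketch of the idea.

```python
# Toy sketch of probabilistic error recovery: when a semicolon is missing,
# try inserting one at each position and keep the repair that a simple
# bigram model over token kinds finds most probable.
# Token kinds and probabilities are invented; no real compiler works this way.

bigram_logp = {
    ("ident", "="): -0.5, ("=", "number"): -0.5, ("number", ";"): -0.3,
    (";", "ident"): -0.4, ("ident", "("): -0.6, ("(", ")"): -0.2,
    (")", ";"): -0.3, (";", "<eof>"): -0.1,
}
FLOOR = -10.0  # log-probability assigned to unseen token pairs

def score(tokens):
    return sum(bigram_logp.get(pair, FLOOR) for pair in zip(tokens, tokens[1:]))

def best_semicolon_repair(tokens):
    candidates = [tokens[:i] + [";"] + tokens[i:] for i in range(1, len(tokens))]
    return max(candidates, key=score)

broken = ["ident", "=", "number", "ident", "(", ")", ";", "<eof>"]
print(best_semicolon_repair(broken))
# the model prefers the semicolon right after "number", where it belongs
```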
My entire point above is that Chomsky's contributions to language modelling are the very thing he's complaining about. But what he's really saying is "humans are special, language has a structure that is embedded in human minds, and no probabilistic model can recapitulate that or show any sign of self-awareness/consciousness/understanding". I do not think that humans are "special" or that "understanding" is what he thinks it is.
I get it now. What’s funny is your vulgar interpretation of “language”.
Which is another pet peeve of his: people who use commonsensical, intuitive words like “language” and “person” to draw unfounded scientific and philosophical parallels between things that can only be compared metaphorically, like human languages and… DNA, I guess.
You think DNA isn't a language? That seems... restrictive. To me, the term language was genericized from "how people talk to each other" to "sequences of tokens which convey information in semi-standardized forms between information-processing entities".
DNA uses the metaphors of language: the central dogma of molecular biology includes "transcription" (copying DNA to RNA) and "translation" (converting a sequence of RNA into protein). Personally, I think those terms stretch the metaphor a bit.
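For concreteness, the two "linguistic" steps look like this in code (only a handful of codons in the table, just as a sketch):

```python
# Sketch of the central dogma's "linguistic" steps:
# transcription (DNA -> RNA) and translation (RNA codons -> amino acids).
# Only a few codons are included here for brevity.
def transcribe(dna):
    """DNA coding strand to mRNA: T becomes U."""
    return dna.replace("T", "U")

CODON_TABLE = {          # partial genetic code, standard assignments
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly",
    "UAA": "STOP", "UAG": "STOP", "UGA": "STOP",
}

def translate(rna):
    """Read the mRNA three bases at a time until a stop codon."""
    protein = []
    for i in range(0, len(rna) - 2, 3):
        aa = CODON_TABLE.get(rna[i:i + 3], "???")
        if aa == "STOP":
            break
        protein.append(aa)
    return protein

rna = transcribe("ATGTTTGGCTAA")   # -> "AUGUUUGGCUAA"
print(translate(rna))              # -> ['Met', 'Phe', 'Gly']
```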