Beyond the disagreement about what recursion is (as used to define the faculty of language in the narrow sense (FLN) in the Hauser, Chomsky, and Fitch paper mentioned in the interview), there are significant problems with Everett's argument in general, which are covered quite well in the rebuttal.
But I can recommend Everett's book "Don't sleep, there are snakes", which is only partly about language.
edit - actually I think I see it - he's not inflecting the articles correctly after "von"?
Anyhow, in German you can use the Saxon genitive prenominally if the word in the genitive can be used like a personal name: "ich sehe Karls Auto" ("I see Karl's car"). And it is true that you cannot recurse on that: "ich sehe Karls Bruders Auto" ("I see Karl's brother's car") is not German, except maybe in German poetry two hundred years ago, when the language was bent violently to fit the meter. Even "ich sehe meines Vaters Auto" ("I see my father's car") sounds outside the norm.
However, you can recurse postnominally: "ich sehe das Auto meines Vaters" ("I see my father's car"), "ich sehe das Auto des Bruders meines Vaters" ("I see my father's brother's car"). You do not have to say "ich sehe das Auto vom Bruder von meinem Vater" ("I see the car of the brother of my father").
Several edits, sorry.
>Everett stated that Pirahã cannot say "John's brother's house" but must say, "John has a brother. This brother has a house." in two separate sentences.
Compounding this with the apparent lack of numbers/counting, expressing higher-order concepts seems like it'd be borderline impossible in the language as-is. I wonder if many other languages started out "simple" like this and developed grammatical complexity alongside their societies?
Of course, the issue here could just be a lack of understanding and insufficient ability on the researchers' part to ask native speakers the right questions. The language has only about 250 speakers, and picking it up as a second language with zero resources leaves plenty of room for error. More recent research suggests embedding might be possible, but no non-native speakers are really sure.
Recursive embedding is one of the foundational marks of human language (though not the only one), and finding a natural language without it would cast long shadows across the existing literature while making that language's "discoverer" very famous.
His data have been challenged by a number of other linguists, despite Everett's attempts to keep others from accessing first-hand sources.
Note too that the existence of alternate constructions such as "John has a brother. This brother has a house." does not preclude the language's ability to accommodate embedding.
Anyway, I'm no orthodox Chomskyan (in part because I'm not well enough educated to be :), but I think all of Everett's claims should be taken with as many grains of salt as we can find in our immediate vicinity.
Here's a more recent experiment - https://cosmosmagazine.com/people/behaviour/complex-linguist... - which seems to show that some primates have recursive reasoning skills similar to those of young (under 4) children.
Most of the literature on birds stems from a confusion of the distinction between regular string languages and context free string languages with the distinction between grammars with and without recursive rules. The two distinctions are largely orthogonal. It is certainly possible, for example, to define certain regular string languages using recursive grammars, and to define certain context free string languages without using recursive grammars.
When you say that "the result must be a context free grammar" I think what you mean to say is that the string language defined must be a context free (and non-regular) string language. But that does not in any way entail that the only way to recognize the string language is by means of a particular context free grammar.
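To make the first case concrete, here's a small sketch of my own in Python (not from the thread): a recursive-descent recognizer whose grammar S -> 'a' S | 'a' contains a recursive rule, yet the string language it defines, a+, is perfectly regular.

```python
# A recursive grammar defining a *regular* string language.
# Grammar: S -> 'a' S | 'a'   (recursive rule, but L(S) = a+, which is regular)

def match_S(s, i):
    """Try to derive S starting at position i; return the end position or None."""
    if i < len(s) and s[i] == "a":
        # Recursive alternative: S -> 'a' S
        end = match_S(s, i + 1)
        if end is not None:
            return end
        # Base alternative: S -> 'a'
        return i + 1
    return None

def recognizes(s):
    return match_S(s, 0) == len(s)

print(recognizes("aaa"))  # True
print(recognizes("aab"))  # False
```

The point is just that "recursion in the grammar" and "non-regularity of the string language" are independent properties.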
What I meant by "the result will still have to be a context-free grammar" was a context-free grammar that incorporates a counting mechanism as part of its definition. Something like this, perhaps (in Definite Clause Grammars notation, so you can actually run it as a Prolog program):
% a^n b^n, n >= 1: the list argument acts as a unary counter shared
% by both sides, forcing equal numbers of a's and b's.
'S' --> 'A'(N), 'B'(N).
'A'([]) --> 'A'.                  % base case: a single a
'A'([1|As]) --> 'A', 'A'(As).    % each recursion adds one a
'B'([]) --> 'B'.
'B'([1|Bs]) --> 'B', 'B'(Bs).
'A' --> [a].
'B' --> [b].
% e.g. ?- phrase('S', [a,a,b,b]).   succeeds
That CFGs model so many languages so well is fascinating and puts constraints on the language faculty, but if some language happens to not use the full power of it, that shouldn't come as quite such a surprise. Semantics are not context-free and we know that the underlying brain mechanism is far more complicated than finite-state-machine-plus-stack.
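To make the "finite-state-machine-plus-stack" point concrete, here's a minimal sketch of my own in Python: a recognizer for the a^n b^n language the DCG above defines, using nothing but a two-state finite control and a stack over a single symbol (i.e. a counter).

```python
# Pushdown recognizer for { a^n b^n : n >= 1 }: two control states plus
# a stack over one symbol are all the machinery this non-regular language needs.

def accepts_anbn(s):
    stack = []       # stack over a single symbol; effectively a counter
    seen_b = False   # two-state finite control
    for ch in s:
        if ch == "a" and not seen_b:
            stack.append("A")    # push on each 'a'
        elif ch == "b" and stack:
            seen_b = True        # once we see 'b', no more 'a's allowed
            stack.pop()          # pop on each 'b'
        else:
            return False
    return seen_b and not stack

print(accepts_anbn("aabb"))  # True
print(accepts_anbn("aab"))   # False
```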
A full study of Pirahã would likely be illuminating if it really is as different from other languages as Everett says -- but it's practically useless to have him say only "It doesn't have feature X". Unless he can also show that a Pirahã child can't learn another language -- that would really mean something.
The fact that linguists make a controversy out of it means I must be missing something, but I can't figure out what.
I generally agree; finding a language which opts not to use recursion would be a weird phenomenon, and might make certain claims about UG weaker. But it wouldn't invalidate all of traditional linguistic research for the past however many years or totally upend our current theoretical frameworks.
And even Everett didn't catch the bilabial trill affricate until 2004, like 20+ years into his research.
In case some of you are interested in the "merge" structure they mention, you can look up the "Minimalist Program", by Chomsky and others. That's where it comes from.
I've worked with some clever animals and agree they didn't show any signs of recursive embedding (they "lex" but don't "parse"). But I must note that modern anglophone humans are relatively embedding-impoverished compared with prior centuries, a reduction we can observe by comparing the number of stack levels necessary to parse the subordinations and coordinations in the dendritic periods of the 1st president of the US with the number necessary to parse the sequential utterances of the most recent.
> "In these honorable qualifications, I behold the surest pledges, that as on one side, no local prejudices, or attachments; no separate views, nor party animosities, will misdirect the comprehensive and equal eye which ought to watch over this great assemblage of communities and interests: so, on another, that the foundations of our National policy will be laid in the pure and immutable principles of private morality; and the pre-eminence of a free Government, be exemplified by all the attributes which can win the affections of its Citizens, and command the respect of the world."
So I needed recursion and a bit more grammar to make this hierarchical phrase structure work out.
... exemplified by
    all the attributes
        win the affections
            of its Citizens, and
        command the respect
            of the world.
(no ((local prejudices) or (attachments))
no (separate views)
nor (party animosities))
... will misdirect ...
as on one side ... so, on another, ...
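The "stack levels" in question can be counted mechanically. Here's a tiny helper of my own (hypothetical, not from the thread) that measures the maximum nesting depth of a bracketed sketch like the one above: each "(" pushes, each ")" pops.

```python
# Count the maximum stack depth needed to parse a bracketed sketch:
# depth goes up on '(' and down on ')'; the peak is the answer.

def max_nesting_depth(s):
    depth = max_depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
            max_depth = max(max_depth, depth)
        elif ch == ")":
            depth -= 1
    return max_depth

print(max_nesting_depth("(no ((local prejudices) or (attachments)))"))  # 3
```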
While "carhouse" may not be a thing, you might be surprised to learn of the existence of a "car condo" (or "motor condo"): https://www.irongatemotorcondos.com/
Yes, really. This business not only exists, but is thriving.
This is such a journalist thing to write. The idea that the number of sentences that can be produced using a given language is potentially infinite is such an old one that it hardly has anything to do with Adger, and Adger hardly needed to "argue" it.
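The old point can be shown in a few lines: a finite grammar with one recursive rule already yields a distinct sentence for every depth n. The toy grammar below is my own illustration, not Adger's.

```python
# Toy recursive grammar: NP -> "the dog" | "the dog that saw " NP
# Applying the recursive rule n times gives a different sentence for each n,
# so the set of sentences is unbounded even though the grammar is finite.

def sentence(n):
    np = "the dog"
    for _ in range(n):               # apply the recursive rule n times
        np = "the dog that saw " + np
    return np + " slept"

for n in range(3):
    print(sentence(n))
# the dog slept
# the dog that saw the dog slept
# the dog that saw the dog that saw the dog slept
```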
This is such a tantalizing idea, but there doesn't seem to be any way to extend it beyond "we don't know what it would be like to think with a different language paradigm, but it would probably be different." I want to know how human thoughts are shaped and constrained by human language.
or "Story of Your Life" (the Ted Chiang novella)
Formal diagrams (still potentially unbounded combinations of a bounded symbolic repertoire, but connected in more than just a temporal dimension) are as close as I can think of at the moment.
Anyone know of more recent work on https://www.microsoft.com/en-us/research/wp-content/uploads/... ?
AI can do compositionality.