
There is no language instinct (2014) - prostoalex
https://aeon.co/essays/the-evidence-is-in-there-is-no-language-instinct
======
LesZedCB
I'm not sure this is completely related, but every time I think about Chomsky
and language and universal grammar, this comes to mind and still leaves me
baffled and amazed. This is a short five-minute video on whistled Turkish. In a
village in Turkey, they uphold the tradition of communicating by whistling
phonemes. It is a direct mapping from Turkish to whistled Turkish, where every
phoneme in Turkish has a whistled equivalent; the difference is that it uses
pitch changes as phonemes. It blows my mind.

[https://www.youtube.com/watch?v=bQf38Ybo1IY](https://www.youtube.com/watch?v=bQf38Ybo1IY)

~~~
mbrock
There's a similar tradition of whistled Spanish on the Canary Islands. It was
used to talk across mountain valleys and stuff. UNESCO designated it as some
kind of cultural heritage, I think.

~~~
panglott
Whistled language isn't all that uncommon: it also occurs in some Inuit
languages and in tonal languages. They even have different airstream mechanisms,
IIRC: Inuit whistled languages use palatal whistling, whereas in Silbo Gomero
(pardon the spelling) you whistle into your hands.

Check out the Pirahã channels, where they also have "hum speech", "yell
speech", and "musical speech".
[http://itre.cis.upenn.edu/~myl/languagelog/archives/003175.h...](http://itre.cis.upenn.edu/~myl/languagelog/archives/003175.html)

------
simondedalus
this article is bad. chomsky and his ilk have always acknowledged the wide
variety of linguistic behavior; that's the whole point. universal grammar is
the set of parameters to be fixed (which sounds carry meaning? what word order
is used? etc.), and learning a language is the setting of those parameters.

humans may have linguistic "knowledge" that they're born with, or they may
not... but either way this article is beside the point.

~~~
rspeer
This article is very much on point when it comes to the language debate,
you're just taking an oddly hedged position on it.

Chomsky and his ilk have made very strong, very specific claims about language
in the past. They have claimed that certain combinations of linguistic
parameters are impossible, and that certain features are shared by all
languages. They have actively denied the wide variety of linguistic behavior
until the evidence piles up enough that they change the topic.

Followers of this brand of linguistics have resisted empirical tests of these
claims, and some of them have academically attacked those who disagree. And
yet these claims have been chipped away anyway until all that's left is a
really vague claim that most humans can communicate using recursion.

Your second paragraph seems at odds with your first. If these parameters exist
to be set, that's knowledge. Knowing that word order _is a thing that can
vary_ is knowledge. Chomsky says we are born with this knowledge. Some who
disagree with him say we aren't born with it, and that we have to learn it
from experience.

In the part where you hedge and say "they may not...", you're surprisingly
agreeing with this article more than Chomsky. Chomsky would say "Humans
absolutely have linguistic knowledge they are born with. Anyone who disagrees
is a behaviorist charlatan."

Chomsky's linguistics was indeed a way better model of learning than
behaviorism, but that was 60 years ago and we need a better model now.

~~~
emn13
I have no idea about the details of this debate, but even so it's obvious that
this article is full of warning signs - tricky strawmen and other fallacies.

For instance, it makes the absurd comparison between the supposed innate
talent for language and a spider's web-weaving - spiders weave webs without
examples, so why not kids? Spiders are about as poor a stand-in for a human as
you'll find in the macroscopic world. Also, even if something is instinctual,
it's hardly reasonable to assume that instinct will _always_ overcome any
situation, regardless of trauma, especially in animals as impressionable as
humans. Finally, just because an animal has an instinct and/or innate aptitude
doesn't mean that practice is irrelevant. The girl who was held in a cellar
until she was 13 only shows that either the instinct isn't absolute (no
surprise there), or simply that the instinct still requires exercise.

Then there's the flatly false claim that DNA doesn't have the storage
capacity to hold a universal grammar. Oh really? The human genome codes for
on the order of a gigabyte of data. AFAIK, there's strong evidence that
there's _not_ a lot of evolutionary pressure to keep genomes small, IIRC based
on the observation that even small, competing microbes have dramatically
varying genome sizes (when you'd expect smaller-genomed competitors to
have won if it really mattered). For some context: a compressed wordlist
containing 350,000 English words is less than a megabyte. In general, grammars
tend to be much smaller, especially if you consider that the claim isn't that
a full grammar is stored, but at most the "basics" that can get rehydrated
from _learnt_ language. Another poster in this thread claimed his 500-line
Python program can generate a good approximation of human language given
enough input data. I wouldn't be surprised if a few kilobytes would suffice to
describe the components from which any grammar can be built - much smaller
than the full grammar itself.

There's another absurd misunderstanding of evolution, in which the article
manages to suggest that because evolution is necessarily continuous in the
space of genomes, it must also be continuous in the space of outcomes (i.e.
language is so large a leap that it couldn't have evolved). I'm sure many
creationists feel the same way about the eye. Not to mention that the article
then proceeds to knock down some strawmen about when that language ability
might have evolved according to Chomsky (which basically amounts to saying
that because Chomsky apparently doesn't know how language evolved, his other
theory must be false. Really?)

I have _no idea_ about the details here, but this article is about as
unconvincing as you can get. He may well be right - I'm not a linguist, and to
be frank I'm not entirely sure why the claim even matters (not a great sign
for a scientific hypothesis). But I'm sure this article isn't making a lot of
sense.

~~~
panglott
That discussion of evolution seems to me like he was paraphrasing and
criticizing Chomsky's ideas about language evolution. Chomsky has argued both
for an innate language organ and a language acquisition device, and against
attempts to explain such a thing in terms of evolution. It's a weird position
that the author is right to criticize (if a bit clumsily).
[http://itre.cis.upenn.edu/~myl/languagelog/archives/002134.h...](http://itre.cis.upenn.edu/~myl/languagelog/archives/002134.html)
[http://www.nybooks.com/articles/archives/1995/nov/30/genes-m...](http://www.nybooks.com/articles/archives/1995/nov/30/genes-memes-minds/)

------
oxplot
Hmm, given that a few-hundred-line Python script fed a large stream of
characters can start producing sentences with correct grammar (though mostly
nonsensical ones), I agree that we, given the complexity of our neural
networks, don't need any pre-existing machinery to learn languages.
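As a toy illustration of the idea (my own sketch, using a simple character
n-gram model rather than the neural network discussed below), even a few
dozen lines of Python fed raw text will start echoing the local grammar of
its input:

```python
import random
from collections import defaultdict

def train(text, order=4):
    """Map each length-`order` context to the characters that follow it."""
    model = defaultdict(list)
    for i in range(len(text) - order):
        model[text[i:i + order]].append(text[i + order])
    return model

def generate(model, seed, length=80):
    """Extend `seed` one character at a time from the learned statistics.
    `seed` must be exactly `order` characters long."""
    out = seed
    for _ in range(length):
        followers = model.get(out[-len(seed):])
        if not followers:
            break
        out += random.choice(followers)
    return out

corpus = "the cat sat on the mat. the dog sat on the log. " * 20
model = train(corpus)
print(generate(model, "the "))
```

The output is statistically plausible but meaning-free, which is exactly the
"correct grammar, mostly nonsensical" behavior described above.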

~~~
emn13
But in order to do that, the "model" (i.e. the Python program itself) needs to
be carefully selected. Most Python programs - or other programs - won't do
this. Indeed, you need to construct the program to pay special attention to
particular kinds of statistics, and the choice of those structures would
correspond to the hypothetical "universal grammar" - at least, as far as I
understand without being overly encumbered by any expertise on the matter ;-).

~~~
oxplot
I wrote the parent comment on the go, so here are more details. I'm specifically
referring to the Python script [1] mentioned in the article titled "The
Unreasonable Effectiveness of Recurrent Neural Networks" [2]. I don't know
much about ANNs, but after a quick glance at the code, it seems to set up
a generic three-layer RNN. Based on this, I would argue that the ability to
learn patterns in sequences of data is all that's needed to learn a language.
Calling it a "grammar" ties it too closely to the concept of language, since
such learning ability extends beyond human languages.

[1]:
[https://gist.github.com/karpathy/d4dee566867f8291f086](https://gist.github.com/karpathy/d4dee566867f8291f086)

[2]: [http://karpathy.github.io/2015/05/21/rnn-effectiveness/](http://karpathy.github.io/2015/05/21/rnn-effectiveness/)

~~~
YeGoblynQueenne
Chomsky based his original argument that language is an innate ability of the
human species on a famous mathematical result by Mark E. Gold, from a paper
titled "Language identification in the limit", published in 1967 [1].

Gold's result basically says that it's impossible for a learner to identify a
non-finite class of languages (anything from the regular languages on up)
without access to an infinite number of examples of the language, both
positive and negative, or, failing that, access to an oracle that knows the
language perfectly and can confirm or reject the learner's evolving model of
it. In some versions, the oracle must respond to errors with a _negative_
counter-example, teaching the learner why its model is wrong ("if the
sentence you just formed were correct, then you could form this other
sentence, which is obviously wrong").

As far as I know, Gold's result still stands. So I think you may be
overestimating the power of the Python script you link to.

I should say that I'm aware of Karpathy's project, and it's really cool (or
awesome, if you prefer), but reproducing text is not the same as learning a
language.

In other words, if you expect any program to learn a language, you had better
have infinite data to train it with, and also an infinite amount of time. I
stress that this doesn't only apply to human languages. You could try teaching
a network the syntax of Java, for example. You'll still need infinite
resources, or an oracle.
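To give a concrete flavour of the result (a toy sketch of my own, not Gold's
actual proof): a learner that sees only positive examples of the infinite
language a* = {"", "a", "aa", ...} and conservatively conjectures "the
language is exactly the sample so far" has to revise its guess at every step,
and so never converges on the target language:

```python
def conjecture(sample):
    # The most conservative hypothesis consistent with the data seen so far.
    return frozenset(sample)

seen = []
guesses = []
for n in range(10):          # feed the positive examples "", "a", "aa", ...
    seen.append("a" * n)
    guesses.append(conjecture(seen))

# Every new example forces a new hypothesis: the learner revised its guess
# at all 9 steps, and by the same argument it never stabilizes.
revisions = sum(g1 != g2 for g1, g2 in zip(guesses, guesses[1:]))
print(revisions)  # 9
```

A smarter learner could of course guess "a*" early, but Gold's theorem shows
no single strategy can get every language in such a class right on positive
data alone.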

++ Edited to remove unfortunate scare quotes, and also to add: if you could
machine-learn a language, even "just" a CFG, we wouldn't need to write parsers
for programming-language compilers anymore. It would suffice to present a
machine-learning algorithm with some examples (a lot of them, probably), and
the algorithm would form a model of the language that we could then use as a
parser.

This is far from the state of the art today, however.

__________________

[1]
[http://web.mit.edu/~6.863/www/spring2009/readings/gold67limi...](http://web.mit.edu/~6.863/www/spring2009/readings/gold67limit.pdf)

------
hasenj
I've been thinking lately that our brains might have evolved for communication
using language. The fact that they can also do philosophy, math, science and
engineering is just an unintended (yet fortunate) consequence of having enough
capability to handle language.

Learning a foreign language can be as challenging as learning to program or
building a software system. It takes a lot of dedication and practice, and it
involves a lot of memory, pattern recognition, and all the "intellectual"
capacities that are needed to handle a large engineering project.

------
mitchtbaum
I only scanned through this very long article to pull out useful info. Here's
what I got:

> [Chomsky] concluded that [children] must be born with a rudimentary body of
> grammatical knowledge – a ‘Universal Grammar’ – written into the human DNA.

> ... Chomsky is plain wrong

> What is in dispute is the claim that knowledge of language itself – the
> language software – is something that each human child is born with.

> Our brains really are ‘language-ready’ in the following limited sense: they
> have the right sort of working memory to process sentence-level syntax, and
> an unusually large prefrontal cortex that gives us the associative learning
> capacity to use symbols in the first place.

> And of course, language doesn’t need to be spoken. ..linguistic meaning can
> be conveyed in multiple ways: in speech, by gestured signs, on the printed
> page or computer screen. It does not depend upon a particular medium for its
> expression.

> As it happens, cognitive neuroscience research from the past two decades or
> so has begun to lift the veil on where language is processed in the brain.
> The short answer is that it is everywhere.

> Why is it that today, only humans have language, the most complex of animal
> behaviours?

> [Early humans'] new ecological situation would have led, inexorably, to
> changes in human behaviour. ... This allows us to picture the emergence of
> language as a gradual process from many overlapping tendencies.

> We see this instinct [(cooperation)] at work in human infants as they
> attempt to acquire their mother tongue. Children have far more sophisticated
> learning capacities than Chomsky foresaw. They are able to deploy
> sophisticated intention-recognition abilities from a young age, perhaps as
> early as nine months old, in order to begin to figure out the communicative
> purposes of the adults around them. And this is, ultimately, an outcome of
> our co‑operative minds.

~~~
Retra
I always like to frame this kind of thing using the term "appropriateness."

We can imagine an amoeba having a rudimentary ability to detect whether its
surroundings are hostile and to respond instinctually by evading them. It's
not much of a leap from there to say that humans have a social instinct, and
that our ability to evade hostility has become strongly correlated with our
ability to identify behaviors that are appropriate in context.

And language is, in an abstract sense, just this: choosing appropriate
actions. Maybe they're words. Maybe they're intonations. Or other social
norms. Nobody learns 'just' language as a child, they learn a whole way of
life. They learn "swearing is wrong," not simply "this is the proper syntax
for swearing." And they _feel_ its wrongness when they swear. You can't know
what swear words mean if you can't feel that, and you probably won't be able
to survive very well as a human if you can't determine what behavior is
appropriate or not in your situation.

------
GarvielLoken
It's not about the brain, it's about the body: the right brain's innate
connection to one's own body and to the world. We know that there are nouns
because the body senses objects. We know there are verbs because the body can
make movements. We know there are adjectives because objects differ. It's a
very left-brain-dominated view that would even contemplate a language system
evolving free of interaction with the world, with some language organ
constructed through DNA alone.

And language is probably a product of singing. Check out "The Master and His
Emissary: The Divided Brain and the Making of the Western World". Singing is
used to share feelings in larger groups than ordinary grooming permits.
Language is a systematization of singing. That is why music evokes a feeling
response more readily than words in ordinary speech; speech evolved later.

------
akkartik
My _favorite_ theory of the origin of language: [http://www.amazon.com/The-Symbolic-Species-Co-evolution-Lang...](http://www.amazon.com/The-Symbolic-Species-Co-evolution-Language/dp/0393317544).
It unfolds a bit like a murder mystery, so I won't spoil the ending for you.
If the middle section seems a bit of a slog -- persevere!

------
igravious
"the chances of two individuals getting the same chance mutation, at exactly
the same time, is even less credible. And so, according to the theory of the
language instinct, the world’s first language-equipped human presumably had no
one to talk to."

This is why I believe that if there are any genetic encodings for language
(whether specific to language alone, or non-specific but also used by
language, if you get my drift), they must have first occurred in twins, who
were then able to communicate more effectively with each other than their
peers; they survived, and so did their offspring, and their genes spread
throughout the gene pool. I know it's a wacky idea, but it gets around the
rebuttal of "presumably had no one to talk to."

And sure, it's an unlikely event but we're talking hundreds of thousands of
years here, thousands of generations, billions of people.

------
Xophmeister

        Ctrl+F Everett
    

...I thought as much

~~~
wodenokoto
Mind sharing what the impact of the result of that search is supposed to be?

~~~
Pitarou
Daniel Everett is an anthropologist who has made some extraordinary claims
about an Amazonian tribe called the Pirahã.

~~~
panglott
OK... but so what? If you're doing a general interest article on debates in
linguistics in the last 10 years, you're going to end up covering Everett's
work and its implications for Chomsky's theory du jour. This article's
discussion of Everett and recursion was pretty brief, overall.
[http://chronicle.com/blogs/linguafranca/2012/03/28/poisonous...](http://chronicle.com/blogs/linguafranca/2012/03/28/poisonous-dispute/)
[http://itre.cis.upenn.edu/~myl/languagelog/archives/004592.h...](http://itre.cis.upenn.edu/~myl/languagelog/archives/004592.html)

