
Neuroscientists find new support for Chomsky’s “internal grammar” thesis - tompark
http://www.nyu.edu/about/news-publications/news/2015/12/07/chomsky-was-right-nyu-researchers-find-we-do-have-a-grammar-in-our-head.html
======
MrQuincle
I think we also have visual grammars. We are surprised if heavy and large
objects are on top of light and small objects. We have priors on lighting and
shadows, depth and occlusion, inside and outside.

We solve these statistically with complex priors and likelihoods. How much you
can shove into the prior is the question of universal grammar. That you are
born with linguistic priors is, I think, not so surprising. That those priors
would be of the old-fashioned symbolic variety Chomsky proposes is what would
be surprising. It's way too "clean" for my liking. Personal opinion!
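
A minimal sketch of that Bayesian picture (all numbers invented, just to show
where a prior enters):

    import numpy as np

    # Hypotheses about a scene: which block sits on top?
    hypotheses = ["heavy_on_light", "light_on_heavy"]

    # Prior: before any evidence, "heavy on light" is surprising,
    # so it gets low prior probability.
    prior = np.array([0.1, 0.9])

    # Likelihood of the noisy visual evidence under each hypothesis.
    likelihood = np.array([0.7, 0.4])

    # Posterior is proportional to prior times likelihood (Bayes' rule).
    posterior = prior * likelihood
    posterior /= posterior.sum()
    print(dict(zip(hypotheses, posterior)))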

~~~
cmarschner
I'm not sure about that. A newborn first learns to recognize shapes, then
starts exploring its environment using hand-eye coordination. It learns the
rules of gravity by watching and feeling. It has a few inborn reflexes, like
looking for the source of milk right after birth, but the physics engine seems
very much a learnt thing. It also seems the fundamental laws of physics can
easily be learnt by a system like the brain (we can even do it with artificial
neural nets). The visual cortex seems similarly simple. Only for understanding
language have we been missing a couple of pieces of the puzzle. That still
seems like a miracle. Disclaimer: anecdotal evidence.

~~~
lawpoop
Since most babies tend to follow the steps of development according to a
timeline (i.e. a, then b, then c, and never out of order), and typically on
time, isn't this rather evidence that such neural systems grow, rather than
that the physics system is learned?

The idea that some complex systems can simply grow without inputs is not
unfounded in the biological world -- witness, for example, grazing animals
such as cows and giraffes that plop out of the birth canal, shakily stand up,
and begin trotting around. They couldn't have trained that system in the womb.

Similarly, human babies who are growing complex neural systems over years
might look like they are learning, but instead they are just growing. That's
not to say that we don't learn things later on -- certainly we do -- but
that's not evidence that babies are necessarily learning all the skills they
become capable of in toddlerhood.

~~~
mladenkovacevic
The way you formulated that is fascinating and made me understand the whole
debate a little better. The strict order in which we grasp things does indeed
indicate that these basic skills are perhaps not left to chance, waiting until
the correct input is provided, but are hardwired into the process that
develops our brains.

I guess the Chomsky debate is whether grammar is one of those basic skills.
Would two children left unsupervised by adults through their formative years
construct their own language that surpasses the communication exhibited by
gorillas, for example?

~~~
lawpoop
This has actually happened -- caveat, with deaf people (but sign language does
have grammar) -- check out Nicaraguan Sign Language. Pinker has a good chapter
on it in The Language Instinct.

------
pazimzadeh
> Their results showed that the subjects’ brains distinctly tracked three
> components of the phrases they heard, reflecting a hierarchy in our neural
> processing of linguistic structures: words, phrases, and then sentences—at
> the same time.

How are they distinguishing between grammar that's encoded in the brain due to
"nature" vs grammar that's encoded in the brain via "nurture"? I think we all
knew that the brain has some mechanism to detect grammar.

~~~
soheil
Your assumption is that all the information necessary and sufficient to decode
grammar into its constituents is detectable from the spoken words that a child
hears. According to Chomsky there is a distinction between I-Language and
E-Language [1] (internal vs. external), namely because grammar in most human
languages is not the most computationally efficient way of structuring
sentences, hence the need for a uniquely human, innate ability [2].

This is also why monkeys can never be trained to learn human language [3].

[1]
[https://en.wikipedia.org/wiki/Transformational_grammar#.22I-...](https://en.wikipedia.org/wiki/Transformational_grammar#.22I-Language.22_and_.22E-Language.22)

[2]
[http://www.theatlantic.com/technology/archive/2012/11/noam-c...](http://www.theatlantic.com/technology/archive/2012/11/noam-chomsky-on-where-artificial-intelligence-went-wrong/261637/)

[3]
[https://en.wikipedia.org/wiki/Nim_Chimpsky](https://en.wikipedia.org/wiki/Nim_Chimpsky)

~~~
yqx
I don't think any assumption was made here. It is as much an assumption to say
that all information necessary and sufficient to decode grammar into its
constituents is _not_ detectable from the spoken words that a child is exposed
to. That's a theoretical argument. Just because we can't think of a way this
could be done doesn't mean it can't be done. At the moment, we simply don't
know.

I think what the comment you're responding to meant is that this study shows
no evidence for an innate internal language or grammar, which I thought was
Chomsky's most controversial claim.

------
mindcrime
This appears to be the paper in question:

[http://psych.nyu.edu/clash/dp_papers/Ding_nn2015.pdf](http://psych.nyu.edu/clash/dp_papers/Ding_nn2015.pdf)

Also, for more background on the idea of the "Universal Grammar":

[https://en.wikipedia.org/wiki/Universal_grammar](https://en.wikipedia.org/wiki/Universal_grammar)
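
If I'm reading the paper right, the core trick is frequency tagging: syllables
are presented isochronously, so phrases and sentences recur at fixed fractions
of the syllable rate, and a brain tracking each level should show a spectral
peak at each rate. A toy version of that logic in Python (the specific rates
and numbers here are my own, not taken from the paper):

    import numpy as np

    # Toy frequency-tagging demo: a signal that tracks syllables (4 Hz),
    # two-syllable phrases (2 Hz), and two-phrase sentences (1 Hz) at once
    # shows distinct spectral peaks at all three rates, even in noise.
    fs, T = 100, 60                            # sample rate (Hz), duration (s)
    t = np.arange(fs * T) / fs
    signal = (np.sin(2 * np.pi * 4 * t)        # syllable/word-rate response
              + 0.5 * np.sin(2 * np.pi * 2 * t)  # phrase-rate response
              + 0.3 * np.sin(2 * np.pi * 1 * t)  # sentence-rate response
              + np.random.randn(t.size))         # neural noise

    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(t.size, d=1 / fs)
    for f in (1.0, 2.0, 4.0):
        print(f, "Hz peak:", spectrum[np.argmin(np.abs(freqs - f))])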

~~~
dlwh
Not a neuroscientist, but I do do NLP, and I only lightly skimmed the paper.

This doesn't really speak to UG.

First, you can believe in the structures they purport to show without
accepting the existence of UG, by appealing to the existence of general
mechanisms in the brain for assembling hierarchical structures, which is
equally validated by this experiment.

Second, they looked at two languages with sentences of up to ~7 syllables
each, with at most two constituents (Noun Phrase + Verb Phrase). You can't
show any evidence for any hierarchy of interest in 7 syllables. They
demonstrated that phrases exist and phrase boundaries exist, but it's entirely
possible to have "flat" grammars without interesting hierarchy, especially in
simple sentences.
If they want to show interesting hierarchy, they should conduct experiments
with more interesting structure (say, some internal PPs and some limited
center embedding) and show something that correlates with multiple levels of
the "stack" getting popped, or something.

It's still interesting work, but as usual oversold by the university press
office.

~~~
mindcrime
_First, you can believe in the structures they purport to show without
accepting the existence of UG, by appealing to the existence of general
mechanisms in the brain for assembling hierarchical structures, which is
equally validated by this experiment._

That was kinda my impression as well, but I don't want to say much more, as
I'm so far from an expert on this that I'd probably just make an idiot of
myself. Still, as you say, it is interesting work in its own right.

------
kazinator
I suspect we have a "grammar" which is based on massively parallel pattern
matching over a sequence of symbols of bounded length. (That would explain why
we deal with ambiguity well, but not with long sentences.)
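
A crude sketch of what I mean, in Python (everything here is invented): many
stored templates checked against every bounded-length window of the input, so
ambiguity is cheap but long-range structure is invisible.

    # Stored templates matched against every bounded-length window of
    # part-of-speech symbols. All windows can be checked in parallel in
    # principle; dependencies longer than the window are simply lost.
    TEMPLATES = {
        ("DET", "ADJ", "NOUN"): "noun phrase",
        ("NOUN", "VERB", "NOUN"): "clause",
    }
    WINDOW = 3  # bounded symbol length

    def matches(tags):
        return [(i, TEMPLATES[w])
                for i in range(len(tags) - WINDOW + 1)
                for w in (tuple(tags[i:i + WINDOW]),)
                if w in TEMPLATES]

    # Overlapping matches coexist, which is why ambiguity is cheap here.
    print(matches(["DET", "ADJ", "NOUN", "VERB", "NOUN"]))
    # -> [(0, 'noun phrase'), (2, 'clause')]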

~~~
logicallee
If it's so massively parallel, why do I read your comment left to right, one
word at a time? Why are there garden path sentences that mislead us as we
build up an erroneous parse? Sure, there's some fuzzy pattern matching, but
lexical parsing doesn't seem so massively parallel to me -- more like a huge
lookup table of fuzzy matches (including whole phrases; I don't have to say
much about how I'm never gonna give you up before that clicks for you), while
we're also be parsing lexically with quite a formal grammar.

I put a grammar error into the last part of the last sentence purposefully,
after I wrote it, so you can read it and notice the error explicitly. (Whereas
I would expect you to still figure out what I meant.) If it were just pattern
matching, that grammatical error wouldn't bother you at all even if you
noticed it; you would just go with the most likely fuzzy interpretation.

~~~
vidarh
> If it were just pattern matching that grammatical error wouldn't bother you
> at all even if you notice it, you would just go with the most likely fuzzy
> interpretation.

I actually had to read the sentence about 3 times, and force myself to read it
slowly word by word, before I spotted it.

We often _don't_ read one word at a time.

I agree with you that "massively parallel" is probably drastically overstating
it, though.

~~~
andreasvc
> "massively parallel" is probably drastically overstating it, though.

Think of a single-core processor executing instructions one at a time. Now
think of ten billion neurons in your brain, a large proportion of which are
active at any time. That is definitely massively parallel.

~~~
vidarh
At that level, yes. But those neurons are not doing high level symbol
recognition tied to a grammar.

This is like pointing to the billions of transistors in a typical modern CPU
and claiming it is massively parallel despite being single core.

The brain clearly can do some things in parallel (I can write one thing while
talking about something else, for example). But it most certainly can't
usually carry out many high-level tasks at the same time.

When it comes to things like recognising words/symbols, the question is how
high-level that actually is, and whether or not the brain has the capacity to
recognise more than one at the same time, and if so how many.

------
yongjik
> Neuroscientists and psychologists predominantly reject this viewpoint,
> contending that our comprehension does not result from an internal grammar;
> rather, it is based on both statistical calculations between words and sound
> cues to structure.

Wait, is this true? There are people who seriously suggest that we don't have
an innate notion of grammar?

~~~
gliese1337
Yes, there are. Or at least, that we don't have a special-purpose module of
our brain devoted just to that. The common counter to Chomsky's version of
innate grammar is that language is the result of the interaction of a bunch of
different cognitive modules which all evolved for different purposes, but
which when combined happen to make language possible, and have since been
under selection pressure to make language work better.

The article doesn't actually provide any support for the "innate" or
"universal grammar" hypothesis as it is typically understood, though. This is
really about whether the de-facto grammar that emerges from whatever the
"language faculty" has a particular hierarchical and recursive structure. And
yes, there are people who seriously argue against that, too. The most common
alternative model that I have been exposed to is analogical template matching,
which actually does a pretty good job at a lot of NLP tasks.
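
In caricature, it looks something like this toy sketch (my own, not any
particular system): a new utterance is analyzed by analogy to its closest
stored exemplar rather than derived from rules.

    from difflib import SequenceMatcher

    # Caricature of analogical template matching: classify an utterance by
    # similarity to stored exemplars, with no grammar rules anywhere.
    exemplars = {
        "the dog chased the cat": "transitive clause",
        "the dog slept": "intransitive clause",
        "did the dog sleep": "yes/no question",
    }

    def analyze(utterance):
        return max(exemplars,
                   key=lambda e: SequenceMatcher(None, e, utterance).ratio())

    best = analyze("the cat chased a bird")
    print(best, "->", exemplars[best])  # matched by analogy, not derivation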

~~~
yongjik
Well, OK, I could understand a position saying "We don't have a special-
purpose module of our brain devoted to grammar," which is (I think) the direct
opposite of Chomsky's position. I'm not a neuroscientist, so I don't really
know how close we are to answering this.

However, that is very different from the quoted passage. It sounds like these
"neuroscientists and psychologists" don't think our brain has any notion of
"grammar" at all, and everything can be explained better with statistical
relations. I find that very hard to accept, and I wonder if a more reasonable
position (= there is no _special_ part of the brain reserved for learning
grammar) was summarized poorly into a much stronger hypothesis.

Well, of course, if we reduce everything as much as we like, then we could say
everything is about the statistical relations of a gazillion variables (or
quantum mechanics, if you go far enough). But that's not useful, is it? Even if
we
cannot pinpoint a neuron and say "This neuron will fire if it sees a past
perfect!", if the brain as a whole acts like it has an underlying notion of
grammar, then I think it's fair to say that it "knows" grammar.

And, forgive me if I'm wrong, but I'm pretty sure that my brain acts like it
has some notion of English grammar. :P

Edit: Hmm, I think I caused confusion by using the word "innate". Apparently
it can mean (1) existing from the time a person or animal is born, and (2)
existing as part of the basic nature of something.

Chomsky proposes (1), and many disagree. (I don't find Chomsky's arguments
particularly convincing, either.) But the sentence I quoted (which didn't use
the word "innate") sounds like it was against (2).

~~~
brenschluss
Let's say that you are very good at basketball, good enough that you can
almost always make a shot from anywhere on the court. Your mind is computing,
based on visual feedback, how to fire your muscles in such a way that the
basketball traces a near-perfect parabola and makes it into the net.

Does your brain "know" the formulas that describe projectile motion? Is it
running a physics model?
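
For concreteness, here is the explicit version of what the shooter's brain
implicitly solves (standard projectile motion; the distances and angle are
made up):

    import math

    # Release speed needed so the parabola passes through the hoop:
    #   v^2 = g * d^2 / (2 * cos(a)^2 * (d * tan(a) - dh))
    # d = horizontal distance, dh = hoop height above release, a = angle.
    def release_speed(d, dh, angle_deg, g=9.81):
        a = math.radians(angle_deg)
        return math.sqrt(g * d ** 2 /
                         (2 * math.cos(a) ** 2 * (d * math.tan(a) - dh)))

    # Rough free throw: 4.2 m to the hoop, rim 1.0 m above release, 50 degrees.
    print(release_speed(4.2, 1.0, 50))  # ~7.2 m/s

Nobody claims the shooter consciously evaluates that equation; the question is
whether "knowing grammar" is the same implicit kind of "knowing".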

~~~
semi-extrinsic
A very important difference: someone who is good at throwing a basketball (or
any sport, really) will "switch off" their brain while doing so. There are
various words for this, e.g. "being in the zone". Ask the person to think
about something while throwing, and they'll likely miss. It takes years of
training to teach the brain to do the correct motions while "switched off".

Using language, on the other hand, requires us to really think.

~~~
mbrock
Language doesn't seem to require conscious thought either. Philosophers like
Heidegger and Wittgenstein talked about that a lot. Positing an internal
thought process "behind" everything is basically what Heidegger spent his life
refuting, from what I've read of Dreyfus's book... And Wittgenstein explains
language as a tool that people use to get things done, more or less. You can
talk without "really thinking"; people do it all the time. Grammar is natural
in that sense. Pinker's "The Language Instinct" mentions studies showing that
more highly educated people in some situations tend to make more grammatical
blunders and to be slightly less "fluent," maybe because of second-guessing
their language instincts.

(Any errors in this comment are due to the fact that I wrote it without
thinking.)

~~~
semi-extrinsic
I must admit I was thinking more about the mental process of constructing the
content of e.g. an argument than about the process of putting the content into
grammatically correct sentences. I completely agree you can talk, and even be
grammatically correct, without really thinking.

------
lorenzhs
PDF link, because somehow university press releases never contain references
to the actual paper:
[http://psych.nyu.edu/clash/dp_papers/Ding_nn2015.pdf](http://psych.nyu.edu/clash/dp_papers/Ding_nn2015.pdf)

------
DrScump
HN discussion from 6 months ago upon the 50th anniversary of the theory:

[https://news.ycombinator.com/item?id=9762001](https://news.ycombinator.com/item?id=9762001)

------
_0ffh
Well, sooner or later it'll turn out that it's mostly that we're wired to
learn grammars, and that we're biased towards learning some types of grammars
better than others. (I think the evidence was already plentiful before this.)

That's not quite the same as how some people seem to understand Chomsky: a
kind of grab-bag of hardwired grammar rules tucked away in some strand of DNA.
But still, there it is.

------
voidhorse
As someone who is a fan of the generative grammar tradition, I think that's
pretty cool!

I'm going to have to read the full study later, but judging from the press
release it sounds like they have some decent results (though not conclusive
evidence for the thesis).

Hopefully this finding can contribute toward pushing studies in neuroscience,
psychology, and linguistics further.

Has some implications for a computational theory of mind too.

------
palosanto
"Neuroscientists and psychologists predominantly reject this viewpoint,
contending that our comprehension does not result from an internal grammar;"

Fascinating article, highly recommended. Not trying to be nitpicky, but I'm
going to need a citation or few to support some of the claims made about
"neuroscientists and psychologists"... just saying.

~~~
phphphph
[Here might be a good
start](https://en.wikipedia.org/wiki/Syntactic_Structures#Criticisms)

------
ivan_karamazov
Math, computation, and grammars are just models, syntax that we sometimes use
to consciously express things about the world. We could assign syntax to a
system composed of anything and say that it is Turing-complete, or that it
follows the productions A->B and B->BB of G_1, an imaginary grammar, just
because we can assign certain symbols to it as inputs and others as outputs.
Numbers and symbols don't exist in the real world. Math entities only exist as
syntax, as human artifacts. They're language entities. That's why they're
based on axioms.
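
To show how cheap such an assignment is, here is that imaginary G_1 as toy
Python (the leftmost-rewriting strategy is my own choice):

    # The imaginary grammar G_1: two productions over strings of symbols.
    RULES = {"A": "B", "B": "BB"}

    def step(s):
        # Rewrite the leftmost symbol that matches a production.
        for i, c in enumerate(s):
            if c in RULES:
                return s[:i] + RULES[c] + s[i + 1:]
        return s

    s = "A"
    for _ in range(4):
        print(s)   # A, B, BB, BBB, ...
        s = step(s)

Nothing physical "is" that derivation; we just chose to read some system's
states as these symbols.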

I don't see the point in saying that syntax is physical. I mean, the neurons
firing in our brains because of photons entering our eyes explain how I
recognized the shape of a car, not that a 'hypothetical 2D array' gets
'processed' with some 'gradient-based algorithm' unconsciously in our brain.
It doesn't make biological sense. Turing didn't even define what a symbol or a
computation is in terms of physics; he just made a (very cool and beautiful,
by the way) model.

Chomsky is simply wrong. He's thinking here that language is suddenly
physical, that we have a grammar-processing CPU in our brains, a form of
Universal Turing Machine! Can this number, '1', be physically somewhere in the
world, taking into account modern physics and not some weird Platonism? Or do
we just use it as language to refer to physical entities in order to talk
about them? I do think the latter is the truth. Grammars are math entities
which we sometimes use to refer to certain things written in books, or to
spoken things that just happen to have some physical shape in the form of
sound waves (words), or even to refer to an imaginary entity we've created
with math, just like abstract machines. I don't think there's some weird
computational thing going on in my head; that's just an oxymoron. Just neurons
firing, blood being pumped, neurotransmitters being sent, in all its glorious
and mysterious nature. The mind, on the other hand, exists as a high-level
feature of some parts of the bigger brain system; it's caused by real things.
It is that mind that lets us do conscious calculations. But that 'Universal
Grammar' wouldn't be inside my mind anyway, because it's supposed to be
unconscious. It wouldn't be a feature of the mind-spanning parts of the brain
either.

In the end, it's just the whole Nietzschean critique of mistaking language
for reality all over again.

I cannot recommend John Searle's work on the subject enough, like the book
'The Rediscovery of the Mind'. He explains all of this way, way better than I
can.

P.S.: Yes, if you think hard about it, this logically implies that a Strong
AI won't ever exist. Sorry, Skynet!

P.P.S.: Sorry for the long post and possible grammatical mistakes; I'm not a
native speaker. And I sincerely hope you enjoyed this post even if you don't
agree with me. Have a nice day!

------
PhaseMage
The study relies on isochronously spoken words!

Inside joke: I recently published a network protocol that relies on sending
words isochronously between switches.

I'll take this as validation that I'm on to something :-)

