
Toddlers’ grammar skills not inherent, but learned, new Stanford research says - stared
http://news.stanford.edu/2017/02/22/new-research-toddlers-grammar-skills-learned-not-innate/
======
stared
See also: On Chomsky and the Two Cultures of Statistical Learning by Peter
Norvig: [http://norvig.com/chomsky.html](http://norvig.com/chomsky.html)

~~~
lngnmn
This basically postulates that language learning is a process similar to a
supervised machine learning algorithm, and this, perhaps, is a more realistic
view than Chomsky's. The machinery has evolved, but "the training data" is
required to make it work.
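
For concreteness, a minimal sketch of that analogy in Python (the bigram
"machinery" and all data here are invented for illustration; this is not the
study's method): a learner whose counting machinery is fixed in advance, but
which can judge word order only after seeing training data.

    from collections import Counter

    def train(sentences):
        """Count adjacent part-of-speech pairs seen in well-formed input."""
        bigrams = Counter()
        for tags in sentences:
            bigrams.update(zip(tags, tags[1:]))
        return bigrams

    def score(bigrams, tags):
        """Fraction of a sentence's POS bigrams that were seen in training."""
        pairs = list(zip(tags, tags[1:]))
        return sum(bigrams[p] > 0 for p in pairs) / len(pairs)

    # "Training data": POS-tag sequences from well-formed English sentences.
    model = train([("DET", "NOUN", "VERB", "DET", "NOUN"),
                   ("NOUN", "VERB", "NOUN"),
                   ("DET", "NOUN", "VERB")])
    print(score(model, ("DET", "NOUN", "VERB")))  # familiar order -> 1.0
    print(score(model, ("VERB", "DET", "DET")))   # unfamiliar order -> 0.5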

~~~
ubernostrum
Be careful of the trap: for quite a while now, each generation has simply
treated it as "realistic" to match the brain's workings to that generation's
popular new computational technique.

------
lngnmn
It is learned because the brain has specialized circuitry to be able to learn
it.

The machinery has obviously evolved and is inherited, while the grammar of a
particular language is learned.

There is also a strong hypothesis that the structure of the specialized areas
of the brain "reflects" general notions of the shared environment, such as
things, attributes, and processes, which correspond to nouns, adjectives, and
verbs.

Just as the visual cortex has its own hard-wired heuristics and
presuppositions (which use environmental cues such as shadows, distances, and
illumination) in the process of scene decomposition and interpretation, the
language area should have its own hard-wired heuristics, which, presumably,
are used in grammar learning.

The knowledge (heuristics) is encoded in the structure, not in some kind of
information.

~~~
_dps
> The knowledge (heuristics) is encoded in the structure, not in some kind of
> information.

When you're talking about neural computation, distinguishing between an
evolved network structure/architecture and "information" (or at least biases
toward certain kinds of information) is not very useful. The nature of these
systems as computing substrates is that they encode information in network
structure and rules for updating that structure.
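
A minimal sketch of that point, using a toy Hopfield-style network (my
example, not the parent's): a pattern is stored purely in connection weights
via a Hebbian update rule, then recovered from a corrupted cue. There is no
explicit record anywhere; the "information" exists only as structure.

    import numpy as np

    # Hebbian rule: strengthen connections between co-active units.
    pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
    W = np.outer(pattern, pattern).astype(float)
    np.fill_diagonal(W, 0)

    # Corrupt a cue, then let the network settle under its update rule.
    cue = pattern.copy()
    cue[:2] *= -1                  # flip two units
    for _ in range(5):
        cue = np.sign(W @ cue)     # synchronous update

    print((cue == pattern).all())  # True: recovered from structure alone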

~~~
lngnmn
> they encode information in network structure and rules for updating that
> structure

I think the use of the term 'information' is not accurate here. There are no
bits, and the whole of information theory, it seems, is synthetic and has
nothing to do with how the brain actually works. The brain is not a digital
machine but an electro-mechanical one, if you wish.

~~~
_dps
Respectfully, this is begging the question :)

You're correct that neural networks don't encode information in the way
current digital machines do (i.e. with explicit records/files/etc. encoded in
bytes). I would disagree that this is the only useful view of information. And
certainly, the network structure contains information (in the Shannon sense)
about the environment or problem for which it was optimized.
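
As a back-of-envelope illustration of information in the Shannon sense (the
probabilities are invented): uncertainty about an environment, measured in
bits, drops when a structure adapted to that environment narrows the
distribution over its states, and the drop is exactly the information the
structure carries.

    import math

    def H(ps):
        """Shannon entropy in bits."""
        return -sum(p * math.log2(p) for p in ps if p > 0)

    # Environment with 4 equally likely states: 2 bits of uncertainty.
    print(H([0.25, 0.25, 0.25, 0.25]))    # 2.0

    # A structure optimized for that environment skews the distribution;
    # the entropy drop (~0.64 bits) is information carried by the structure.
    print(H([0.7, 0.1, 0.1, 0.1]))        # ~1.36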

~~~
lngnmn
> the network structure contains information (in the Shannon sense) about the
> environment

... the way a map represents a territory instead of the way a guidebook does
;)

Or the way a 3D structure made out of the same 20 amino acids could represent
either an enzyme (the code) or a protein (the data), so that there is no
distinction between code and data (which is the great insight realized in the
original Lisp; it is not a coincidence that Lisp was the language of classic
AI. Sorry for the reminiscence - just love this part).

------
skyisblue
It is said that babies who are spoken to in multiple languages start speaking
later than those who are spoken to in just one language. Could this extra
'learning' cause the delay in speech?

------
dpark
How is this not obvious? Who could claim that grammatical understanding is
innate given that grammar is tied to the language and the language itself is
learned? Studying how grammar is learned certainly seems like a valuable
endeavor but I don't understand the "debate" over whether it's innate or
learned.

~~~
dmreedy
Like most ticklish philosophical problems, I think it becomes less obvious the
deeper into the problem you look.

There are a number of considerations at play here, and a long history of
violently disagreeing about them.

Is language tied to grammar? Or is grammar tied to language[1]? There has been
a lot of work seeking to examine the common underpinnings of languages that
seem otherwise disparate, and a somewhat flexible core grammatical structure
has been proposed as one of these unifying components. The speculative step
some take next (including people like Chomsky) is that the reason that all of
these common patterns appear is that the brain has evolved some specific
hardware that serves as the basis for a set of communication protocols that we
call natural languages, and all natural languages instantiate this basic
hardware in different ways[2].

If you think this kind of hardwiring is possible, then another obvious-but-
not-obvious question follows. Can language really be _learned_ [3]? Can you
take a clean slate of neurons (some arbitrary configuration of connections)
and, given enough input data, end up with a system that understands language
(to frame it as a machine learning problem, which it may or may not be)? Or is
it closer to a game of connect-the-dots: the bare framework is there, built
into the way the brain arranges itself over millions of years of evolution.
Input data is just required to figure out exactly what the final arrangement
of the picture is, whether it looks like English or Russian or Mandarin.

The current mood in cognitive neuroscience circles seems to favor the former
case: that you can 'simply' throw enough data at a general-purpose learning
model and get a language-speaker out. This is partly due to the success of
deep learning approaches on what were previously very difficult problems in
NLP. But I don't think it's fair to say there's nothing worthy of debate here.

\---

[1] Or is this a false dichotomy, with some even weirder relationship between
the two?

[2] For a primitive example, languages with SVO ordering vs. SOV ordering:
different instantiations of the same triple of subject, object, and verb. Ray
Jackendoff took this another step with formulations like X-bar grammar, which
tried to demonstrate how a universal, parameterized grammar might look.
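
A toy illustration of that footnote (my own sketch, and nothing like real
X-bar theory): a single order parameter switches the same abstract (subject,
verb, object) triple between English-like SVO and Japanese-like SOV surface
forms.

    def linearize(subject, verb, obj, order="SVO"):
        """Render an abstract (S, V, O) triple in a given surface order."""
        slots = {"S": subject, "V": verb, "O": obj}
        return " ".join(slots[c] for c in order)

    print(linearize("the dog", "bit", "the man", order="SVO"))
    # -> the dog bit the man   (English-like)
    print(linearize("the dog", "bit", "the man", order="SOV"))
    # -> the dog the man bit   (Japanese-like)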

[3] Grice had a fun theory that the observable signal of language is _so_
noisy that it's impossible to learn outright, regardless of the number of
examples you're exposed to. And thus, everyone speaks a slightly different
language, and some sets of these idiolects are, roughly, mutually compatible.
I don't think he ever did any math to demonstrate this though.

~~~
dragonwriter
> Grice had a fun theory that the observable signal of language is _so_ noisy
> that it's impossible to learn outright, regardless of the number of examples
> you're exposed to. And thus, everyone speaks a slightly different language,
> and some sets of these idiolects are, roughly, mutually compatible. I don't
> think he ever did any math to demonstrate this though.

I'm not sure how you'd reduce this to something mathematically demonstrable,
but it seems to me that it's the only thing that _could_ be true. The
alternative is that languages are real, distinct, concrete things that exist
independently of people, rather than arbitrary lines drawn by people around
the continuous and evolving variation in the ways people communicate.

~~~
dmreedy
So, the only hunch I have is that there might be a kind of
cryptographic/information-theoretic approach to it. If you treat natural
language as a noisy encoding of some underlying set of intents (whatever
sentience is), then the question becomes: is there enough information in the
signal (natural language) to accurately model the underlying intents
statistically? If the answer is 'no', or 'yes, but very badly', then it seems
likely to me that there's some shared hardware/firmware (in this framing, the
functional equivalent of a decryption key) between people. If the answer is
'yes', or 'yes, mostly at least', then there's no need for any kind of
universal grammar or such, and you can learn language from a blank slate.
Basically, is natural language something like a one-time pad, or is it more
like a breakable form of encryption?
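
One way to make that hunch concrete (a sketch under my own assumptions, not
Grice's math): model the signal as a binary symmetric channel that flips each
symbol with probability p. The recoverable information per symbol is the
channel capacity 1 - H(p), which hits zero at p = 0.5, the one-time-pad case
where no number of examples helps.

    import math

    def h(p):
        """Binary entropy in bits."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    # Capacity of a binary symmetric channel with flip probability p.
    for p in (0.0, 0.1, 0.3, 0.5):
        print(f"noise={p:.1f}  capacity={1 - h(p):.3f} bits/symbol")
    # noise=0.5 gives capacity 0.000: like a one-time pad, nothing about
    # the underlying intent is learnable, regardless of sample size.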

Of course, there are plenty of assumptions in that model (for one, can you
even separate 'language' from 'sentience' the way an information model would
suggest? I'm not so sure you can, but then I've always been somewhat
sympathetic to Sapir-Whorf). I'm also not a competent enough mathematician in
these fields to see where all the holes in this line of thinking are.

------
multinglets
How long is the one side of this thing going to talk past the other?

------
nyrulez
I am surprised this required research. How is this not obvious? What's next?
Algebra skills and essay writing? You could pretty much do this research for
everything we ever learn from school to college and publish an endless amount
of papers.

~~~
tn13
No, this is not obvious. Throughout human history, and even today, algebra,
writing, and reading have been hard skills for humans, requiring years of
training to master.

The ability to speak, however, comes very naturally without special training.
Human societies independently developed isomorphic grammars. For example,
Navajo code talkers helped American soldiers translate their messages into
Navajo and back so that the enemy would not understand them.

This sort of evidence led scholars like Chomsky to believe that the human
brain might have some in-built ability, shaped by evolution, to parse basic
grammatical rules.

~~~
dpark
> _Ability to speak however comes very naturally without special training._

It literally takes years of practice for children to speak well. Maybe it's
natural but we tend to dismiss the difficulty because it happens early.

I would think that if we had dedicated hardware for grammar, subsequent
languages wouldn't be so difficult to learn either. It's easier to learn
algebra than a second language.

~~~
dilemma
Children learn language without instruction. That is all. Clearly, there is
innate linguistic capability in humans.

~~~
delinka
My granddaughter (1y 8m) uses flossers without instruction. Clearly, there is
innate tooth hygiene in humans.

No, what's happening is that she's mimicking the adults in the house. She's
observing and repeating. She's done the same thing with dolls: put them on her
shoulder, pat them, make shushing noises ... because we do that with her;
because she sees others do that with other babies.

"Instruction"? No. But the only "innate" behaviors this child has involve
eating and curiosity. Everything else is spurred by that innate curiosity.

