That could be the case if Chomsky was right, and Universal Grammar and other similar structures are a thing. It would mean that part of our ability to understand language comes from the particular structure of our brain (which everyone, by and large, seems to share), and therefore that some of our ability to understand language is genetic in nature, by whatever means genes direct the structure of brain development.



So if language comes from the structure of the brain, what would stop us from simulating that structure to give a machine mastery of language? And specifically what would imply that a machine which had some of that structure would need to learn by interaction as the top level comment suggests?


Nothing would stop us from simulating human brain-like (or analogously powerful) structures to build a machine that genuinely understands natural language. I'm arguing that we can't just learn those structures by statistical optimization techniques though.

If it turns out that the easiest, or even the only, means of doing this is by emulating the human brain, then it's entirely possible that we inherit a whole new set of constraints and dependencies, such that world-simulation and an embodied mind are required for such a system to learn. If this turns out not to be the case, and there's some underlying principle of language we can emulate (the classic "airplanes don't fly like birds" argument), then it may be that text is enough. But that rests on a new assumption: that our system came pre-equipped to learn language and didn't manufacture an understanding from whole cloth, i.e. that the model weights were pre-initialized to specific values.
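(A toy numerical sketch of that last point, purely illustrative and not from the thread: "pre-initialized" weights behave like an innate head start. The task, numbers, and learning rate below are all made up.)

    # Fit y = 2*x by gradient descent from two starting weights and count the
    # steps each needs to reach a small error: a stand-in for "comes
    # pre-equipped to learn" vs. "manufactured from whole cloth".
    import numpy as np

    rng = np.random.default_rng(0)
    xs = rng.uniform(-1, 1, 200)
    ys = 2.0 * xs                      # the "underlying principle" to be learned

    def steps_to_fit(w, lr=0.1, tol=1e-4):
        for step in range(10_000):
            w -= lr * np.mean(2 * (w * xs - ys) * xs)   # gradient of MSE wrt w
            if np.mean((w * xs - ys) ** 2) < tol:
                return step
        return None

    print("random init:           ", steps_to_fit(rng.normal()))  # blank slate
    print("pre-initialized near 2:", steps_to_fit(1.9))           # innate structure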


If there is an innate language structure in the brain then we know that it's possible to develop such a structure by statistical optimization, since this is exactly what evolution did, no?
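(To make the analogy concrete, here's a made-up sketch: a crude "evolutionary" loop of random mutation plus selection, which is itself a statistical optimization, can discover initial weights from which a short lifetime of learning succeeds. Nothing below is a claim about how brains actually evolved.)

    # Evolve the *initial* weight of a toy learner so that a few inner gradient
    # steps are enough to fit the task -- a Baldwin-effect-style illustration of
    # "evolution developed the structure, learning fills in the rest".
    import numpy as np

    rng = np.random.default_rng(1)
    xs = rng.uniform(-1, 1, 200)
    ys = 2.0 * xs

    def lifetime_loss(w0, lr=0.1, inner_steps=5):
        w = w0
        for _ in range(inner_steps):              # a short "lifetime" of learning
            w -= lr * np.mean(2 * (w * xs - ys) * xs)
        return np.mean((w * xs - ys) ** 2)

    pop = rng.normal(0.0, 1.0, 20)                # generation 0: random "genomes"
    for _ in range(50):
        fitness = np.array([lifetime_loss(w) for w in pop])
        parents = pop[np.argsort(fitness)[:5]]    # select the 5 fastest learners
        children = rng.choice(parents, 15) + rng.normal(0.0, 0.1, 15)
        pop = np.concatenate([parents, children]) # mutate and repopulate
    print("evolved initial weight:", pop[np.argmin([lifetime_loss(w) for w in pop])])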


But I don't see any reason a "universal grammar" couldn't be learned. It may take something more complicated than ANNs, of course. But it would be really weird if there was a pattern in language that was so obfuscated it couldn't be detected at all.


It comes down to the limits of available Information, with a capital 'I'. If you're working within the encoding system (as you're recommending here with the "all the text in the world" approach), then in order to learn the function that's generating the information, the messages you're examining need to convey a minimum amount of information. There needs to be enough visible structure, purely within the context of the messages themselves, to make the underlying signal clear.
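(To put a rough number on "visible structure within the messages", a hedged toy: a unigram entropy estimate over a made-up corpus. It's the crudest possible bound, but the point is that whatever a learner extracts, it can't exceed the information actually present in the observed symbols.)

    # Estimate per-character entropy of the observed messages. Any model that
    # only ever sees this text is working within this budget, no matter how
    # clever its architecture is.
    from collections import Counter
    from math import log2

    corpus = "the cat sat on the mat the dog sat on the log"   # stand-in corpus
    counts = Counter(corpus)
    total = sum(counts.values())
    entropy = -sum((c / total) * log2(c / total) for c in counts.values())
    print(f"~{entropy:.2f} bits per character visible in the messages themselves")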

I don't think it's so weird to imagine that natural language really doesn't convey a ton of explicit information on its own. Sure, there's some there, enough that our current AI attempts can solve little corners of the bigger problem. But is it so strange to imagine that the machinery of the human brain takes lossy, low-information language and expands, extrapolates, and interprets it so heavily as to make it orders of magnitude more complex than the lossy, narrow channel through which it was conveyed? That the only reason we're capable of learning language and understanding each other (the times we _do_ understand each other) is because we all come pre-equipped with the same decryption hardware?
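(One way to picture the "shared decryption hardware" idea, with completely invented contents: if sender and receiver already share a rich codebook, a message carrying only a couple of bits can pick out a complex meaning; a learner that only ever sees the messages can model their statistics perfectly and still never recover the right-hand sides.)

    # The channel carries ~1.6 bits; the recovered meaning is much richer because
    # the receiver supplies the rest from innate, shared structure.
    shared_codebook = {
        "1": "the speaker wants the listener to pass the salt",
        "2": "the speaker is warning about a predator to the north",
        "3": "the speaker proposes meeting after sunset by the river",
    }

    message = "2"                       # lossy, narrow channel
    print(shared_codebook[message])     # receiver with the same "hardware"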


Nah, it's mostly just hierarchical probability models.

http://www.ncbi.nlm.nih.gov/pubmed/24977647
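(Roughly the flavor, with entirely made-up frames, words, and probabilities, not taken from the paper: sample a high-level frame, then fill its slots from word classes.)

    # Two-level generative model: P(frame) at the top, P(word | slot class) below.
    import random

    frames = {                              # top level: sentence frames
        ("NP", "V", "NP"): 0.7,
        ("NP", "V"):       0.3,
    }
    slots = {                               # bottom level: words within a class
        "NP": {"the cat": 0.5, "the dog": 0.3, "a child": 0.2},
        "V":  {"chased": 0.4, "saw": 0.4, "slept": 0.2},
    }

    def sample(dist):
        return random.choices(list(dist), weights=list(dist.values()))[0]

    frame = sample(frames)
    print(" ".join(sample(slots[cls]) for cls in frame))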


This is a very neat paper, but:

1) They appear to have hand-crafted the skeleton of a grammar with their nodes, super nodes, and slot collocations. This is directly analogous to something like an X-bar grammar, and is not learned by the system; therefore, if anything, it strengthens the Chomskyan position: the system is learning how a certain set of signals satisfies its extant constraints.

2) They don't appear to go beyond generative grammar, which already seems largely solvable by other ML methods, and is a subset of the problem of "language". Correct me if I'm wrong here; it's a very long paper and I may have missed something.




