Hacker News

This is a very neat paper, but:

1) They appear to have hand-crafted the skeleton of a grammar with their nodes, super nodes, and slot collocations. This is directly analogous to something like an X-bar grammar, and it is not learned by the system; if anything, that strengthens the Chomskyan position: the system is learning how a certain set of signals satisfies its extant constraints.

2) They don't appear to go beyond generative grammar, which already seems largely solvable by other ML methods and is only a subset of the broader problem of "language". Correct me if I'm wrong here; it's a very long paper and I may have missed something.

