
A Review of the Neural History of Natural Language Processing - homarp
http://blog.aylien.com/a-review-of-the-recent-history-of-natural-language-processing/
======
prions
This is a great article. One big thing NLP lacks (or, soon enough, lacked)
compared to computer vision is its own version of ImageNet: off-the-shelf
pretrained models that can be adapted to different domains. Many NLP models
are brittle because of the disparity between tasks and problem domains. My
BiLSTM-CRF model for NER, for example, would need retraining on a completely
different label set before it could run inference on another task.
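
To make that label-set coupling concrete, here is a minimal PyTorch sketch
(dimensions are illustrative, and the CRF transition layer is reduced to a
per-token emission layer for brevity): the final projection is sized to the
task's tag inventory, so a new label scheme means re-initializing and
retraining that head.

```python
# Sketch of why a BiLSTM tagger is tied to its label set.
# Hypothetical dimensions; the CRF layer is omitted for brevity --
# only per-token emission scores are shown.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_labels):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            bidirectional=True, batch_first=True)
        # This projection is sized to the task's label set; a different
        # tag scheme (e.g. new NER labels) invalidates it.
        self.emissions = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) -> scores: (batch, seq_len, num_labels)
        out, _ = self.lstm(self.embed(token_ids))
        return self.emissions(out)

# Trained for, say, a 9-label BIO NER scheme; inference over a
# different label inventory requires retraining this head.
tagger = BiLSTMTagger(vocab_size=50000, embed_dim=100,
                      hidden_dim=256, num_labels=9)
scores = tagger(torch.randint(0, 50000, (2, 12)))
print(scores.shape)  # torch.Size([2, 12, 9])
```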

ELMo embeddings are especially interesting. Dynamic Bernoulli embeddings also
deserve a mention:
[https://arxiv.org/pdf/1703.08052.pdf](https://arxiv.org/pdf/1703.08052.pdf)
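
For reference, a minimal sketch of pulling contextual ELMo vectors through
the legacy AllenNLP `Elmo` module; the options/weight file paths below are
placeholders for the published pretrained files, not real paths.

```python
# Sketch: contextual ELMo embeddings via the legacy AllenNLP API.
from allennlp.modules.elmo import Elmo, batch_to_ids

options_file = "elmo_options.json"  # placeholder for pretrained options
weight_file = "elmo_weights.hdf5"   # placeholder for pretrained weights

# One learned scalar mix over the biLM layers per output representation.
elmo = Elmo(options_file, weight_file,
            num_output_representations=1, dropout=0.0)

sentences = [["The", "model", "is", "contextual"],
             ["ELMo", "embeds", "tokens", "in", "context"]]
character_ids = batch_to_ids(sentences)  # (batch, seq_len, 50) char ids
output = elmo(character_ids)

# Each token gets a context-dependent vector (1024-dim for the
# standard pretrained model), unlike static word embeddings.
vectors = output["elmo_representations"][0]  # (batch, seq_len, 1024)
```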

------
gjstein
For those who don't know, this article is by Sebastian Ruder, who has a great
blog [1]. His posts are often deep dives into a particular area of interest
within Natural Language Processing. A favorite of mine from last year gave a
summary of the state of the art in word embeddings [2], which I would
recommend to anyone interested in the field.

[1] [http://ruder.io/#open](http://ruder.io/#open)

[2] [http://ruder.io/word-embeddings-2017/](http://ruder.io/word-embeddings-2017/)

------
mark_l_watson
Great summary. I was really surprised when I went to the NAACL conference in
2016 and most of the NLP papers were deep learning papers. I don't believe
deep learning models will get us anywhere near AGI, but on the other hand, I
think we are only just beginning to explore how far deep learning will take
us. If I had told myself in 2016 what I would be doing at work in 2018 with
RNNs and GANs, I would not have believed it. Surprising breakthroughs now
seem to occur at least weekly.

------
PaulHoule
Nice article. Bad pop-over.

