

Kaggle Chief Data Scientist: Word2Vec “is a really, really, really big deal” - biomimic
https://www.kaggle.com/c/word2vec-nlp-tutorial/forums/t/12349/word2vec-is-based-on-an-approach-from-lawrence-berkeley-national-lab/63262#post63262

======
PaulHoule
Similar things have been around for a long time.

For instance, you can train something like LDA on the contexts that words
appear in. I've also seen a system that uses a neural net to generate
16,000-bit vectors, and very interesting things happen when you combine them
with bitwise AND and OR.
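Those 16,000-bit codes behave much like the binary vectors of hyperdimensional computing, where bitwise OR acts as a union of active features and AND as an intersection. A minimal sketch of that idea, using sparse random binary codes in place of learned ones (the word names and the 1% bit density here are illustrative assumptions, not from the original system):

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 16_000      # vector width mentioned in the comment above
DENSITY = 0.01    # fraction of set bits; an illustrative choice

def random_code():
    """Sparse random binary vector standing in for a learned word code."""
    return rng.random(DIM) < DENSITY

# Hypothetical codes; a real system would produce these with a neural net.
paris = random_code()
france = random_code()

# OR unions the active features of both words; AND keeps only shared ones.
union = paris | france
shared = paris & france

def overlap(a, b):
    """Count of shared set bits -- a crude similarity measure."""
    return int((a & b).sum())
```

With sparse random codes, unrelated words share almost no bits, so a large `overlap` between two learned codes signals genuinely related representations.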

~~~
biomimic
Agreed. I once took a similar approach, using a rich vector space model to
answer natural-language questions about geography.

------
sp332
Some reason you're posting this now? It was big news back in 2013.

~~~
biomimic
Word2Vec is part of a Kaggle competition to classify the sentiment of movie
reviews from Rotten Tomatoes. There seems to be a better method than Word2Vec
coming out of Berkeley Lab. I found that interesting, considering the impact
Jeremy Howard said Word2Vec could have.
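For context, the common baseline in that competition turns each review into a fixed-size feature by averaging its word vectors, which then feeds an ordinary classifier. A minimal sketch of that averaging step, using tiny random embeddings as stand-ins for trained Word2Vec vectors (the vocabulary and dimensions here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
DIM = 50  # tiny embedding size for the sketch; Word2Vec often uses 300

# Stand-in embeddings; a real pipeline would load trained Word2Vec vectors.
vocab = ["great", "awful", "boring", "fun", "movie"]
embeddings = {w: rng.standard_normal(DIM) for w in vocab}

def review_vector(tokens):
    """Average the vectors of known tokens into one fixed-size feature."""
    vecs = [embeddings[t] for t in tokens if t in embeddings]
    return np.mean(vecs, axis=0) if vecs else np.zeros(DIM)

# Unknown tokens like "a" are simply skipped before averaging.
features = review_vector(["a", "great", "fun", "movie"])
```

The resulting `features` vector would be passed to a classifier such as logistic regression or a random forest to predict the review's sentiment.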

