
Dynamic word embeddings for evolving semantic discovery - godelmachine
https://blog.acolyer.org/2018/02/22/dynamic-word-embeddings-for-evolving-semantic-discovery/
======
kbob
Can someone explain what a "norm" is in this context?

> The learned word vector norms across times grow with word frequency, and can
> be viewed as a time series for detecting trending concepts with more
> robustness than word frequency.

> Generally, comparing to frequencies which are more sporadic and noisy, we
> note that the norm of our embeddings encourages smoothness and normalization
> while being indicative of the periods when the corresponding words were
> making news rounds.

Thanks.
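In this context the "norm" is presumably just the Euclidean (L2) length of a word's embedding vector, computed separately for each time slice, so that the sequence of lengths forms a time series per word. A minimal sketch with synthetic data (the matrix shapes, scaling factor, and word index are illustrative assumptions, not the paper's setup):

```python
import numpy as np

# Each time slice t gives every word w a vector u_w(t); the "norm" is
# the Euclidean (L2) length ||u_w(t)||_2 of that vector. The quoted
# passage tracks this length over time as a smoother signal than raw
# word frequency. Everything below is synthetic illustration.
rng = np.random.default_rng(0)
n_slices, vocab_size, dim = 5, 3, 50

base = rng.normal(size=(vocab_size, dim))

# Simulate a "trending" word: word 0's vector grows across time slices
# while the other words' vectors stay fixed.
embeddings = []
for t in range(n_slices):
    E = base.copy()
    E[0] *= 1 + 0.5 * t
    embeddings.append(E)

# Norm time series for word 0: ||u_w(t)||_2 at each time slice t.
word = 0
norm_series = [float(np.linalg.norm(E[word])) for E in embeddings]
print(norm_series)  # steadily increasing -> candidate trending word
```

A rising norm series like this is what the quote means by "detecting trending concepts": the per-slice vector lengths change smoothly, whereas raw frequency counts can jump around from slice to slice.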

