
Hopfield Networks Is All You Need - MAXPOOL
https://arxiv.org/abs/2008.02217
======
> We show that the transformer attention mechanism is the update rule of a
> modern Hopfield network with continuous states. This new Hopfield network
> can store exponentially (with the dimension) many patterns, converges with
> one update, and has exponentially small retrieval errors. The number of
> stored patterns is traded off against convergence speed and retrieval error.
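The abstract's claim can be illustrated with a minimal sketch, assuming the update rule has the form described in the paper, `xi_new = X softmax(beta * X^T xi)`: a noisy query converges to its stored pattern in a single step. The variable names and the value of `beta` here are illustrative, not taken from the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 64, 10                      # pattern dimension, number of stored patterns
X = rng.standard_normal((d, n))    # columns are the stored patterns
X /= np.linalg.norm(X, axis=0)     # unit-normalize each pattern

beta = 8.0                         # inverse temperature; higher = sharper retrieval

def hopfield_update(xi):
    # Softmax over similarities to stored patterns; this is exactly the
    # attention-weight computation, with X playing the role of keys/values.
    scores = beta * (X.T @ xi)
    p = np.exp(scores - scores.max())
    p /= p.sum()
    return X @ p                   # convex combination of stored patterns

# Query: a noisy version of stored pattern 3
query = X[:, 3] + 0.1 * rng.standard_normal(d)
retrieved = hopfield_update(query)

# One update suffices: the retrieved vector is closest to pattern 3
sims = X.T @ (retrieved / np.linalg.norm(retrieved))
print(int(np.argmax(sims)))
```

With the inverse temperature `beta` large, the softmax concentrates almost all weight on the best-matching pattern, which is why retrieval completes in one step with small error.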

[https://github.com/ml-jku/hopfield-layers](https://github.com/ml-jku/hopfield-layers)

[https://old.reddit.com/r/MachineLearning/comments/i4ko0u/r_hopfield_networks_is_all_you_need/](https://old.reddit.com/r/MachineLearning/comments/i4ko0u/r_hopfield_networks_is_all_you_need/)

