
Scaling Memory-Augmented Neural Networks with Sparse Reads and Writes - visionscaper
http://xxx.lanl.gov/abs/1610.09027
======
visarga
Sparse Access Memory is like a differentiable hash table. I wonder how many
other differentiable data structures could be invented to augment deep neural
nets?
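
The "differentiable hash table" intuition can be sketched in a few lines: a
content-based lookup that softmaxes over only the top-k most similar keys, so
reads are both differentiable (with respect to the selected slots) and sparse.
This is a toy illustration of the idea, not the paper's actual SAM mechanism;
the function name and shapes are made up for the example.

```python
import numpy as np

def sparse_read(memory, keys, query, k=2):
    """Toy differentiable-hash-table read.
    memory: (N, D) stored values, keys: (N, D) lookup keys, query: (D,).
    Only the k highest-scoring slots are touched (the "sparse" part)."""
    # Similarity of the query to every stored key (dot product).
    scores = keys @ query
    # Keep only the k best-matching slots.
    topk = np.argsort(scores)[-k:]
    # Softmax over the selected slots gives smooth, differentiable weights.
    w = np.exp(scores[topk] - scores[topk].max())
    w /= w.sum()
    # The read is a convex combination of the selected memory rows.
    return w @ memory[topk]

rng = np.random.default_rng(0)
keys = rng.standard_normal((8, 4))
memory = rng.standard_normal((8, 4))
out = sparse_read(memory, keys, keys[3], k=2)
print(out.shape)  # (4,)
```

With k = N this reduces to ordinary soft attention over the whole memory; the
sparse variant only pays for the k slots actually read.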

