
DeepMind's new algorithm adds 'memory' to AI - thekodols
http://www.wired.co.uk/article/deepmind-atari-learning-sequential-memory-ewc
======
choxi
I always assumed there was only one mechanism for memory in the brain, but
it's interesting how multiple Deep Learning architectures have different
mechanisms that could all be described as memory: RNNs and LSTMs can
selectively save states from sequences of data, Neural Turing Machines
basically have a RAM unit to store larger data structures, and now we have
these networks that can remember "skills" across different training programs.
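(The "skills" mechanism in the linked article is DeepMind's elastic weight consolidation, EWC. Its core idea can be sketched as a quadratic penalty that anchors weights that were important to a previous task; the function name and toy numbers below are illustrative, not from the paper.)

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """Quadratic penalty keeping current weights `theta` close to the
    weights `theta_star` learned on an earlier task, weighted by a
    per-weight importance estimate (the Fisher information in EWC)."""
    return (lam / 2.0) * np.sum(fisher * (theta - theta_star) ** 2)

# Toy example: two weights; the first mattered a lot for the old task.
theta_star = np.array([1.0, -0.5])   # weights after the old task
fisher     = np.array([5.0,  0.1])   # importance estimates
theta      = np.array([1.2, -0.5])   # weights drifting during a new task

# Moving the important weight is penalized far more than moving the other,
# which is what lets the network keep an old "skill" while learning a new one.
print(ewc_penalty(theta, theta_star, fisher))
```

Adding this term to the new task's loss is what discourages the network from overwriting the old skill, rather than any explicit stored state as in an RNN or NTM.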

If we draw parallels to the human brain, these mechanisms look something like
short-term memory, long-term memory, and something like "skills memory" (I'm
not sure whether the latter has a proper academic term, but it's a common
experience, e.g. you never forget how to ride a bike).

Maybe one reason human memory has been such an elusive concept is that it's
actually many different, independent mechanisms rather than just one.

