

Neural Transformation Machine: Sequence-To-Sequence Learning - groar
http://arxiv.org/abs/1506.06442

======
deepnet
The OP paper proposes a different architectural approach to the translation
task addressed in the 2014 NIPS paper by Ilya Sutskever, Oriol Vinyals & Quoc
Le, _Sequence to Sequence Learning with Neural Networks_

[http://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks](http://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks)

which "uses a multilayered Long Short-Term Memory (LSTM) to map the input
sequence to a vector of a fixed dimensionality, and then another deep LSTM to
decode the target sequence from the vector." The task was English to French.

Meng et al. (2015) (the OP) translate Chinese to English using a network
based on the Neural Turing Machine (NTM) with LSTM units; they name this
novel architecture the Neural Transformation Machine (NTram).

The Neural Turing Machine (NTM) was proposed by DeepMind's Alex Graves, Greg
Wayne & Ivo Danihelka. It couples a neural network controller to an external
memory bank, producing a differentiable, and thus trainable, analogue of a
Turing Machine or von Neumann architecture that can perform copying, sorting
and associative recall - an exploration of whether neural networks can be put
to basic computing functions.
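
What makes the NTM trainable is that every memory access is a soft attention
over all slots, so gradients flow through reads and writes. Here is a toy
numpy sketch of the content-based addressing only - my own simplification,
not the paper's full mechanism (real NTMs add location-based shifting,
interpolation and learned erase/add vectors):

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)

    def read(memory, key, beta=5.0):
        # Soft attention over memory rows; beta sharpens the focus.
        w = softmax(beta * np.array([cosine(row, key) for row in memory]))
        return w @ memory, w  # read vector = weighted blend of all slots

    def write(memory, w, erase, add):
        # Blended erase-then-add, so writing is differentiable too.
        return memory * (1 - np.outer(w, erase)) + np.outer(w, add)

    M = np.random.randn(8, 4)        # 8 memory slots, each of width 4
    r, w = read(M, key=M[3].copy())  # query with the content of slot 3
    print(np.argmax(w))              # -> 3: associative recall of that slot
    M = write(M, w, erase=np.ones(4), add=np.zeros(4))  # clears slot 3

Because reading and writing are weighted sums rather than hard indexing,
the whole machine stays end-to-end differentiable and can be trained with
ordinary gradient descent.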

 _Neural Turing Machines_ (2014) by Alex Graves, Greg Wayne, Ivo Danihelka

[http://arxiv.org/abs/1410.5401v2](http://arxiv.org/abs/1410.5401v2)

