
Torch7 – Scientific computing for LuaJIT - ot
http://torch.ch/
======
srean
Upvoted the story because I am keen to see some discussion around it. I have
kept an eye on Torch7 but there have been a few things that make me a little
wary. A prominent one is the sheer number of unreplied-to posts on their
mailing list. The Numpy/Scipy and Julia lists seem much more welcoming and
prompt: even if people there cannot help, they do respond.

Now if we ignore the social side of it, I would really like to know why
Torch7/Lua and not, say, Numpy/Scipy/Theano, or Julia for that matter. To me,
scratching an itch is a fine enough reason; I just wanted to know whether there
are any compelling technical advantages to doing it this way.

Some aspects are quite promising: one is the ability to JIT-compile away most
of the FFI glue, another is interoperation with Numpy/Scipy code. There have
been a few projects to allow seamless back and forth between Lua and Python,
so if that can be done with Numpy and Cython, that is indeed quite an exciting
possibility.
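To make the first point concrete, here is a minimal sketch of what "JIT
compiling away the FFI glue" looks like with the LuaJIT FFI (a plain libm call,
not Torch-specific; the declaration is copied from the C header by hand):

```lua
-- Minimal LuaJIT FFI sketch: declare a C function and call it directly.
-- No hand-written binding code is needed; in JIT-compiled traces the call
-- is lowered to (roughly) a direct C function call.
local ffi = require("ffi")

ffi.cdef[[
double sqrt(double x);
]]

print(ffi.C.sqrt(16.0))
```

This is the mechanism Torch7's C/CUDA tensor back-ends are bound through, which
is why interfacing native code is so cheap compared to a ctypes-style FFI.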

~~~
ihnorton
> Some aspects are quite promising, one is to JIT compile away most of the FFI
> glue, another is interaction with Numpy/Scipy code.

As a note, both of these things are available in Julia. The built-in `ccall`
allows FFI calls without glue code and at the same cost as a shared library
function call from C (no ctypes/libffi overhead). The PyCall package provides
(damn-near magical) interoperation with any Python library.

------
elyase
There was a recent Reddit AMA with Yann LeCun [1] where he comments on
Torch7 [slightly edited]:

 _Torch is a numerical/scientific computing extension of LuaJIT with an
ML/neural net library on top. Torch7 is what is being used for deep learning
R&D at NYU, at Facebook AI Research, at Deep Mind, and at Google Brain.

(At Facebook) We are using Torch7 for many projects (as does Deep Mind and
several groups at Google) and will be contributing to the public version. We
are using Torch for most of our research projects (and some of our development
projects) at Facebook. Deep Mind is also using Torch in a big way (largely
because my former student and Torch-co-maintainer Koray Kavukcuoglu sold them
on it). Since the Deep Mind acquisition, folks in the Google Brain group in
Mountain View have also started to use it.

Facebook, NYU, and Google/Deep Mind all have custom CUDA back-ends for
fast/parallel convolutional network training. Some of this code is not (yet)
part of the public distribution.

The huge advantage of LuaJIT over Python is that it is way, way faster,
leaner, simpler, and that interfacing C/C++/CUDA code to it is incredibly easy
and fast.

Torch is maintained by Ronan Collobert (IDIAP), Koray Kavukcuoglu (Deep Mind,
former PhD student of mine) and Clément Farabet (running his own startup;
also a former PhD student of mine). We have used Torch as the main research
platform in my NYU lab for quite a while.

You could say that Torch is the direct heir of Lush, though the maintainers
are different. Lush was mostly maintained by Leon Bottou and me. Ralf
Juengling took over the development of Lush2 a few years ago. _

[1] [http://fastml.com/yann-lecuns-answers-from-the-reddit-
ama/](http://fastml.com/yann-lecuns-answers-from-the-reddit-ama/)

------
ot
I came across Torch in this FB comment by Yann LeCun:

    
    
        I have moved to Torch7. My NYU lab uses Torch7, Facebook AI
        Research uses Torch7, DeepMind and other folks at Google use
        Torch7.
    

Apparently it is used in many deep learning labs. I think that in Toronto they
mostly used Matlab and Python; does anybody know if this is still true?

[https://www.facebook.com/yann.lecun/posts/10152077631217143?...](https://www.facebook.com/yann.lecun/posts/10152077631217143?comment_id=10152089275552143&offset=0&total_comments=6)

