
Ask HN: What Deep Learning Libraries are you using? - larryfreeman
There have been a large number of Deep Learning libraries released recently.

TensorFlow is getting a lot of buzz. Theano and Torch are often spoken about. Berkeley has Caffe and Microsoft has released CNTK. Nvidia has cuDNN, and Keras looks very nice. What have you used? What do you like about the library? What are your pain points?
======
dennybritz
These libraries are not mutually exclusive; they operate at different levels
of abstraction. For example, you can use Keras with either Theano or
Tensorflow. cuDNN is an interface to the hardware and is used internally by
most libraries.

Breaking it down. High-Level Frameworks:

\- Caffe is very high-level and used almost exclusively for Convolutional
Neural Networks. It doesn't have good support for RNNs or anything else. It
has a very good collection of pre-trained models (the model zoo).

\- Keras is a "wrapper" around Tensorflow or Theano and includes many higher-
level abstractions like various types of layers, optimizers, etc. It's
typically what I recommend to anyone who wants to get something up and running
quickly and doesn't necessarily want to develop novel models.
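To make the "layers and optimizers" abstraction concrete, here is a toy sketch in plain Python. This is not Keras's actual API, just an illustration of what a framework-level fully-connected layer handles for you: you declare sizes and an activation, and the forward pass is taken care of.

```python
import math
import random

class Dense:
    """Toy fully-connected layer: y = activation(W x + b).
    A sketch of the kind of abstraction Keras provides, not its real API."""

    def __init__(self, in_dim, out_dim, activation=math.tanh, seed=0):
        rng = random.Random(seed)
        self.w = [[rng.uniform(-0.1, 0.1) for _ in range(in_dim)]
                  for _ in range(out_dim)]
        self.b = [0.0] * out_dim
        self.activation = activation

    def __call__(self, x):
        # One output per row of W: activation(dot(row, x) + bias)
        return [self.activation(sum(wi * xi for wi, xi in zip(row, x)) + bi)
                for row, bi in zip(self.w, self.b)]

# Stacking layers, in the spirit of Keras's Sequential model:
layers = [Dense(4, 3), Dense(3, 2)]
x = [1.0, 0.5, -0.5, 0.0]
for layer in layers:
    x = layer(x)
print(len(x))  # 2: the final layer's output size
```

In real Keras you would write roughly the same shape of code with `Sequential` and `Dense`, plus a `compile` step that picks the optimizer and loss.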

On the next lower level are Theano and Tensorflow. They are pretty much
competing with each other and have a very similar computational model
(computational graphs). People/Companies seem to be moving towards Tensorflow,
so that's what I'd recommend using at this point. Tensorflow recently added
several higher-level abstractions (like TF Learn and contrib modules) that are
quite similar to those in Keras.
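The "computational graph" model both frameworks share can be sketched in a few lines of plain Python (illustrative only, not the Theano/TensorFlow API): you first build a graph of symbolic operations, and only later evaluate it with concrete inputs, which is what lets the framework optimize and differentiate the whole graph before running it.

```python
class Node:
    """A node in a tiny computational graph: operations are recorded,
    not executed, when the expression is written down."""

    def __init__(self, op, inputs=(), value=None):
        self.op, self.inputs, self.value = op, inputs, value

    def __add__(self, other):
        return Node('add', (self, other))

    def __mul__(self, other):
        return Node('mul', (self, other))

def placeholder():
    # An input slot to be filled in at evaluation time.
    return Node('placeholder')

def run(node, feed):
    """Evaluate the graph only now, in the spirit of session.run() in
    TF 1.x or theano.function in Theano."""
    if node.op == 'placeholder':
        return feed[node]
    args = [run(i, feed) for i in node.inputs]
    return args[0] + args[1] if node.op == 'add' else args[0] * args[1]

# Build the graph first...
x = placeholder()
y = placeholder()
z = x * y + x
# ...then execute it with concrete values:
print(run(z, {x: 3.0, y: 4.0}))  # 15.0
```

The real frameworks do the same two-phase thing at scale: the deferred graph is what they compile to GPU kernels and backpropagate through.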

cuDNN is a library for GPU acceleration. It's _used_ by most of these
libraries under the hood to speed up computation. You certainly can use cuDNN
directly, but unless you're doing low-level research it's probably not
necessary.

------
malux85
I used to use Keras on top of Theano, and ran performance tests of Keras on
TensorFlow in parallel with every TensorFlow release.

The last release (0.8) of TensorFlow is much faster for my use case (model
compilation twice as fast, execution about 10% faster), so I made the switch
to TensorFlow across the whole cluster about a week ago. It has been
performing well, and the transition was pretty seamless thanks to Keras.

------
fenier
I'd advise you not to think of yourself as a dummy. It puts you in the wrong
mindset.

