
Getting Started with Deep Learning: A Review of Available Tools - mollison
http://www.svds.com/getting-started-deep-learning/
======
vonnik
This list is woefully incomplete. It should be titled "A Review of Some
Tools". The tools it leaves out, or fails to properly mention in its top
table, include Chainer, DyNet, Paddle, and Deeplearning4j.[0]

Even Keras, the third most popular DL library, gets short shrift, even though
it is easier to use than TensorFlow and can run Theano, TF, CNTK, and
Deeplearning4j as backends.
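
For what it's worth, the backend switch is decoupled from the model code. A
minimal sketch, assuming the Keras-era configuration where the backend is
picked via the KERAS_BACKEND environment variable (or ~/.keras/keras.json):

```python
import os

# Choose the backend before Keras is imported; "theano", "tensorflow",
# and "cntk" were the officially supported values in this era.
os.environ["KERAS_BACKEND"] = "theano"

from keras.models import Sequential
from keras.layers import Dense

# The model definition is identical no matter which backend runs it.
model = Sequential([
    Dense(64, activation="relu", input_dim=100),
    Dense(10, activation="softmax"),
])
model.compile(optimizer="sgd", loss="categorical_crossentropy")
```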

All of the missing libs have significant advantages over TensorFlow, notably
speed and integrations, yet TensorFlow appears to be winning chiefly on
GitHub stars.

[0] [https://deeplearning4j.org/](https://deeplearning4j.org/)

[http://chainer.org/](http://chainer.org/)

[https://github.com/clab/dynet](https://github.com/clab/dynet)

[https://github.com/PaddlePaddle/Paddle](https://github.com/PaddlePaddle/Paddle)

~~~
curuinor
Chainer is about 50k sloc of very idiomatic Python, a little dealie for
emulating numpy on the GPU, and a bit of Cython.

TensorFlow is about 700k sloc of C++, Python, and some Golang for some
godawful reason, with an enormous, annoying build dealie.

That was my own decision calculus, anyhow. Speed of development and complexity
still matter in machine learning land, you know.
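
To illustrate the "idiomatic Python" point: a Chainer-2-style define-by-run
network is just a plain Python class. A rough sketch (my own, not code from
Chainer itself):

```python
import chainer
import chainer.functions as F
import chainer.links as L

# Define-by-run: the forward pass is ordinary Python control flow,
# so the computation graph is built as the code executes.
class MLP(chainer.Chain):
    def __init__(self):
        super(MLP, self).__init__()
        with self.init_scope():
            self.l1 = L.Linear(None, 100)  # input size inferred on first call
            self.l2 = L.Linear(100, 10)

    def __call__(self, x):
        h = F.relu(self.l1(x))
        return self.l2(h)
```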

~~~
curuinor
One more objectionable fact about TF: there's a separate, better
documentation repo for Google insiders only. That's not kosher.

~~~
dontreact
In what way do you think it's better? I use both the internal and external
docs, and the only difference is that there are more Google-infra-specific
things internally. The external one is prettier :)

~~~
curuinor
It is definitely the case that a fair few of those 700k sloc are devoted to
poking at and wibbling with Google-infra-specific things, which is of
basically negative value if you're some rando using it.

There are also a fair few non-infra-specific bits in the internal
documentation.

------
kriro
The best resource (imo) is "Practical Deep Learning". As the title suggests,
it is... well, practical. They specifically say that a lot of deep learning
seems to be made complex on purpose, and that it's important to get a good
overview and start doing stuff immediately. They use old Kaggle competitions
as benchmarks. After lesson 1 you can already recreate the infamous dog/cat
classifier and get pretty amazing accuracy (>90% iirc).

It's all free and online; there's also a paid in-person course (in San
Francisco iirc). They use Keras (so one level of abstraction higher than the
stuff in the post; see the sketch after the link below) and make heavy use of
Python notebooks. I also love the approach of using AWS p2 instances for
everything... very nice to pay as you go for the GPU power (I wasn't even
aware these instance types existed before watching the videos).

The second course isn't online yet, but the claim is that you'll be at the
bleeding edge after that one (i.e. able to compete with state-of-the-art
papers). I fully believe it.

[http://course.fast.ai/](http://course.fast.ai/)
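
To give a feel for that abstraction level, here is a rough Keras-2-style
sketch of the kind of fine-tuning lesson 1 builds up to (the course's actual
notebooks differ; this is just an assumed shape):

```python
from keras.applications.vgg16 import VGG16
from keras.layers import Dense, Flatten
from keras.models import Model

# Freeze a pretrained convolutional base and train a small head on top,
# roughly the shape of the lesson-1 dogs-vs-cats exercise.
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
for layer in base.layers:
    layer.trainable = False

x = Flatten()(base.output)
out = Dense(2, activation="softmax")(x)  # two classes: dog vs. cat

model = Model(inputs=base.input, outputs=out)
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(...) would then train on images from the old Kaggle competition.
```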

~~~
sprobertson
Notebooks are perfect for presenting this kind of project-based lesson
(except that GitHub doesn't render them on mobile). I'm just starting a
similar series, entirely in notebooks, doing natural language tasks with
neural networks in PyTorch:

[https://github.com/spro/practical-pytorch](https://github.com/spro/practical-pytorch)

------
tom_pulo
I appreciate the analysis, but I don't think the title of this should be
"Getting Started with Deep Learning".

If you are _actually_ looking to get started with deep learning, you should go
elsewhere. This is a review of the frameworks and tools people use for deep
learning.

------
abc1984
Why is this titled "Getting started with..." when it is just an overview of a
few selected frameworks? Plus, coming from a company's R&D department, it is
highly disappointing.

------
physicsyogi
There's an error in the article about PyTorch:

"For instance, Caffe (C++) and Torch (Lua) have Python bindings for its
codebase (with PyTorch being released in January 2017), but we would recommend
that you are proficient with C++ or Lua respectively if you would like to use
those technologies."

You don't need to know any Lua to use PyTorch. That's the point. Lua is for
Torch.

Edit: The author may be confusing the PyTorch released by the Torch developers
with the earlier pytorch project that did wrap Torch.
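
A minimal illustration of the point, assuming a recent PyTorch; everything is
plain Python, with no Lua anywhere:

```python
import torch
import torch.nn as nn

# A tiny model defined and run entirely from Python.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 2),
)

x = torch.randn(1, 4)  # a random one-example input batch
y = model(x)           # forward pass; autograd tracks it in pure Python
print(y.shape)         # torch.Size([1, 2])
```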

------
alok-g
I would like to see a column on platform independence (OS support) in a table
like this.

------
ge96
I really want to get into this eventually. My own AI... I know, I'm not a
brilliant mathematician or anything. I just have that obsession (I'm lonely)
hahaha.

I'm looking to build these wall-mounted Raspberry Pi servers with cute little
USB-rubber-ducky antennas. But I don't know what kind of hardware you need (or
whether to cloud-base it) to run some of these. My current thought/approach is
something that's always on, analyzing stuff (telemetry): specific web traffic,
my own thoughts (analyzing posts to a journal, for example).

I don't know... I'll get into it at some point, busy at the moment.

Like, I'd like to hire some vocalists (girls) and have them recite sentences,
then deconstruct them (copy them, essentially) to be able to build full
sentences on the fly without those word-break-word artifacts... ahhhh. Not a
programmer at this point though, e.g. C#, C++, Java, Python, though I use PHP
for scripting. Ahhh. Yes, I like the movie Her (2013) a lot, though I
primarily only watch the intro part where he unboxes OS 1.

Thanks for this link.

