
Stanford Unsupervised Deep Learning Tutorial (2014) - espeed
http://deeplearning.stanford.edu/tutorial/
======
hardmaru
I went through every exercise in the previous version of the Stanford
tutorial a few years ago, when it was called the UFLDL tutorial.

The sparse autoencoder exercise
([http://ufldl.stanford.edu/wiki/index.php/Exercise:Sparse_Autoencoder](http://ufldl.stanford.edu/wiki/index.php/Exercise:Sparse_Autoencoder))
has been my favourite, and I think it is still relevant today, even though it
has become a somewhat neglected concept.
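
For anyone who hasn't done that exercise, the core idea fits in a few lines:
an ordinary autoencoder whose hidden units are penalized for deviating from a
small target average activation. Here is a rough numpy sketch of the cost
function in the tutorial's formulation; the variable names and hyperparameter
values are illustrative, not the exercise's starter code.

```python
import numpy as np

def sparse_autoencoder_cost(W1, b1, W2, b2, X, rho=0.05, beta=3.0, lam=1e-4):
    """Cost of a one-hidden-layer sparse autoencoder.

    X has shape (n_features, n_examples); b1 and b2 are column vectors.
    Cost = reconstruction error + weight decay + KL sparsity penalty,
    following the UFLDL formulation (hyperparameter values here are made up).
    """
    m = X.shape[1]
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    a1 = sigmoid(W1 @ X + b1)   # hidden activations
    a2 = sigmoid(W2 @ a1 + b2)  # reconstruction of the input

    rho_hat = a1.mean(axis=1)   # average activation of each hidden unit
    kl = np.sum(rho * np.log(rho / rho_hat)
                + (1 - rho) * np.log((1 - rho) / (1 - rho_hat)))

    recon = 0.5 / m * np.sum((a2 - X) ** 2)
    decay = 0.5 * lam * (np.sum(W1 ** 2) + np.sum(W2 ** 2))
    return recon + decay + beta * kl
```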

------
jypepin
This is one of my 2017 resolutions: learn more about AI/ML/DL. The field is so
big I'm not sure where to start; I've begun the Udacity AI class and started
reading the book "AI: A Modern Approach" by Peter Norvig.

Has anyone done this before and have any recommendations on where to start,
good resources, etc.?

~~~
paulsutter
For Deep Learning, start with MNIST. The 1998 paper[1] describing LeNet goes
into a lot more detail than more recent papers. Also there's an excellent
video from Martin Gorner at Google that describes a range of neural networks
for MNIST[2]. The source code used in his talk is excellent[3].

[1]
[http://vision.stanford.edu/cs598_spring07/papers/Lecun98.pdf](http://vision.stanford.edu/cs598_spring07/papers/Lecun98.pdf)

[2]
[https://www.youtube.com/watch?v=vq2nnJ4g6N0](https://www.youtube.com/watch?v=vq2nnJ4g6N0)

[3] [https://github.com/martin-gorner/tensorflow-mnist-tutorial](https://github.com/martin-gorner/tensorflow-mnist-tutorial)
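
If you just want something running before reading the paper, a few lines of
Keras give a working MNIST classifier. This is a minimal sketch in the spirit
of [3] (which uses lower-level TensorFlow); the layer sizes here are arbitrary
choices of mine, not taken from the talk.

```python
import tensorflow as tf

# Load MNIST and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small fully connected network; sizes are illustrative only.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(200, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test)
```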

~~~
fnl
Starting with DL? That's like learning calculus before geometry... YMMV?

~~~
paulsutter
Convolutional nets for digit recognition are certainly easier to learn than ML
in general, but I wasn't suggesting starting with Deep Learning. I was
suggesting that, when studying Deep Learning, you start with MNIST.

It was easier for me to start with the LeCun 1998 paper than to watch all the
theorem-proving in the online courses, but that's just personal preference.

~~~
fnl
Sure, but I take it the original comment wasn't exactly from someone with an
ML background. And getting to grips with log likelihoods, (cross-) entropy,
linear/logistic regression, evaluation metrics, and maybe even some Bayesian
statistics might be rather helpful before jumping on the DL bandwagon.
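
To make one of those prerequisites concrete: the binary cross-entropy loss is
just the negative log likelihood of logistic regression. A toy numpy
illustration (mine, with made-up numbers):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cross_entropy(w, b, X, y):
    """Negative log likelihood of logistic regression = binary cross-entropy.

    X: (n_examples, n_features); y: 0/1 labels.
    """
    p = sigmoid(X @ w + b)  # predicted P(y = 1 | x)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

X = np.array([[0.5, 1.2], [-0.3, 0.8], [1.5, -0.7]])
y = np.array([1, 0, 1])
print(cross_entropy(np.zeros(2), 0.0, X, y))  # ln(2) ~= 0.693 for an uninformed model
```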

While there are far too many hardcore statisticians and academics who love
their theorems more than anything, not all classes are that way. I think I'd
have loved it if I could have learned ML from today's MOOCs, instead of those
theorem provers and formula speakers I had to deal with (and pass _real-life_
exams you can't repeat every 8 hrs...)

~~~
paulsutter
I don't have an ML background, and I had no problem understanding the LeCun
1998 paper. Naturally, the more ML one knows the better; I'm just encouraging
people to dive in and try without getting intimidated.

~~~
fnl
Anecdotally, one astonishing observation I often make is that "breakthrough"
papers [1] are nearly universally among the most accessible, clear, and easy
to follow. From Watson and Crick on DNA in molecular biology, to
backpropagation by Hinton in ML, to Cox's survival model in statistics, the
most significant advances often tend to be the "easiest" to understand (in
hindsight only, naturally).

[1] [http://www.nature.com/news/the-top-100-papers-1.16224](http://www.nature.com/news/the-top-100-papers-1.16224)

------
imjustsaying
Why did you add the word "Unsupervised" when it wasn't contained in the
original text?

~~~
thomasahle
It is in the <title> of the article.

UFLDL Tutorial = Unsupervised Feature Learning & Deep Learning Tutorial.

~~~
imjustsaying
Whoa you're right, there it is.

------
aaronjg
When is this from? The most recent paper in the references is from 2010.

~~~
Smerity
I work in the field and I'm with aaronjg - this is ancient in the scheme of
deep learning and very far from modern best practices. I honestly find it
confusing when links like this hit the top of the page with such a strong
number of upvotes. There is more modern, better-maintained material available
now. Even if this is being shared for its historical value, it should include
a timestamp.

~~~
technics256
Where would you recommend looking for that more modern and better material?
Thanks.

~~~
Smerity
For general introductory material in this style from Stanford, CS231n (fairly
general, but specializing in vision) and CS224d (specializing in DL for NLP)
are great. The material for both is online for free, and the video lectures
(taken down due to legal challenges regarding accessibility) are available if
you look hard enough ;)

If you're particularly after unsupervised deep learning, I'd recommend you do
one or both of the above (or equivalent) and then read relevant recent papers.

[http://cs231n.stanford.edu](http://cs231n.stanford.edu)

[http://cs224d.stanford.edu](http://cs224d.stanford.edu)

------
e19293001
I've been following this book[0], Grokking Deep Learning. It isn't finished
yet but is in active development. I like the style of explanation. As of now,
I've learned how to create a very simple neural network, a neural network with
three inputs, and a very simple deep neural network with three inputs, a
hidden layer of size 4, and one output. I'm still looking forward to the next
MEAP releases, and my goal is to understand deep learning for images so I can
apply it to the image-processing work in my day job.

[0] - [https://www.manning.com/books/grokking-deep-learning](https://www.manning.com/books/grokking-deep-learning)
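
For reference, a network of that shape (3 inputs, a hidden layer of 4 units,
1 output) really is only a few lines of numpy. This is my own sketch with an
arbitrary activation, not code from the book:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative weights for a 3-4-1 network: 3 inputs -> 4 hidden units -> 1 output.
W1 = rng.normal(scale=0.1, size=(4, 3))
W2 = rng.normal(scale=0.1, size=(1, 4))

def relu(z):
    return np.maximum(0.0, z)

def forward(x):
    """Forward pass: x is a length-3 input vector; returns a single prediction."""
    hidden = relu(W1 @ x)    # shape (4,)
    return (W2 @ hidden)[0]  # scalar output

print(forward(np.array([1.0, 0.5, -0.2])))
```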

~~~
placebo
I've also been following this book, and it seems as if progress has stalled.
The most recently released chapter is incomplete and half-baked, there is no
response from the author in the book's forum, and errors in previous chapters
remain uncorrected.

~~~
e19293001
The author says there is a big update and that it is going through a review
process.

[https://twitter.com/iamtrask/status/818114151339950081](https://twitter.com/iamtrask/status/818114151339950081)

~~~
placebo
Good to know, thanks. Guess I'm a bit of a dinosaur for not following social
networks :-)

------
TheAlchemist
Guys,

please stop posting this kind of old stuff. Or at least, other guys - please
stop upvoting this.

It's a very old tutorial; there are tons of better ones today, in every
aspect.

Somebody once astutely observed that the ratio of upvotes to comments is a
good indicator of whether something is really good. If the ratio is very high,
it's a red flag. This article is a perfect example - nobody has anything
interesting to say about it...

Maybe the HN admins could take this into account in the sorting algorithm.

~~~
xchip
What are those better tutorials? I'm interested.

~~~
TheAlchemist
I really liked these: \- CS231n from Stanford (outstanding material by Andrej
Karpathy): [http://cs231n.github.io/](http://cs231n.github.io/)

\- Hugo Larochelle videos:
[https://www.youtube.com/channel/UCiDouKcxRmAdc5OeZdiRwAg](https://www.youtube.com/channel/UCiDouKcxRmAdc5OeZdiRwAg)

\- Michael Nielsen tutorial:
[http://neuralnetworksanddeeplearning.com/](http://neuralnetworksanddeeplearning.com/)

\- Chris Olah blog: [http://colah.github.io/](http://colah.github.io/)

\- Keras blog: [https://blog.keras.io/](https://blog.keras.io/)

There is also this one (not really a tutorial though): \- Goodfellow, Bengio,
Courville:
[http://www.deeplearningbook.org/](http://www.deeplearningbook.org/)

Also, there were a lot of 'Ask HN' topics - take a look.

------
KerryJones
It's missing a link.

"This tutorial assumes a basic knowledge of machine learning ... go to this
Machine Learning course [no link]"

Am I missing something?

~~~
metafunctor
Perhaps not exactly the right link, but Andrew Ng's Machine Learning course
(also from Stanford) teaches exactly the required things in the first three
weeks: [https://www.coursera.org/learn/machine-learning](https://www.coursera.org/learn/machine-learning)

~~~
therockhead
It's been a while since I did any serious maths; would I be lost in a course
like this?

~~~
espeed
See this course on linear algebra:

[http://codingthematrix.com/](http://codingthematrix.com/)

[https://cs.brown.edu/video/channels/coding-matrix-fall-2014/](https://cs.brown.edu/video/channels/coding-matrix-fall-2014/)

And this one on Machine Learning:

[http://work.caltech.edu/telecourse.html](http://work.caltech.edu/telecourse.html)

------
dominotw
Link to the tutorial:
[http://deeplearning.stanford.edu/tutorial/](http://deeplearning.stanford.edu/tutorial/)

