
Patrick Winston Explains Deep Learning - rck
http://rodneybrooks.com/patrick-winston-explains-deep-learning/
======
jacquesm
Just watched the first video; it's a very nice intro to the mathematics behind
neural networks. Personally I like Jeremy Howard and Rachel Thomas's video
course [1] a lot better because it is much more geared towards practical
applications, but if you're into the theory end of things, or if you simply
want to understand better what is going on behind the scenes, this video is
really quite good and well worth your time.

[1] [http://course.fast.ai/](http://course.fast.ai/)

~~~
bfe
Thanks, this looks great.

------
JamilD
Patrick Winston is a great lecturer -- and it's worth it to do the whole 6.034
course if you have the time. A lot of the topics he goes over are considered
"out of style" now, but sometimes old-school AI algorithms find their use in
new places [0]…

[0] [https://blog.openai.com/evolution-
strategies/](https://blog.openai.com/evolution-strategies/)

~~~
flor1s
It's funny because in the lecture itself he discussed how the topic of neural
networks was once on the chopping block for the course, but they decided to
keep it in just to keep people from reinventing the wheel.

It's amazing how many of the basics were written about in textbooks from the
80s and even earlier, and how recent work by people such as Geoffrey Hinton
has reinvigorated the field and actually enabled practical applications.

~~~
profosaur
I think Hinton himself was there in the 80s pioneering neural networks, and he
stuck with it this whole time. He was one of the first researchers to
demonstrate the use of backpropagation to train neural networks.
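For anyone curious what that contribution looks like in practice, here is a minimal sketch (my own illustration, not code from the lecture or the thread) of backpropagation training a tiny two-layer sigmoid network on XOR, using only NumPy. The network size, learning rate, and loss are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR dataset: the classic example a single-layer perceptron cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two-layer network: 2 inputs -> 8 hidden units -> 1 output
W1 = rng.normal(0, 1, (2, 8))
b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
loss_history = []
for _ in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    loss_history.append(float(((out - y) ** 2).sum()))

    # Backward pass: apply the chain rule layer by layer
    d_out = (out - y) * out * (1 - out)   # grad of squared error w.r.t. pre-activation
    d_h = (d_out @ W2.T) * h * (1 - h)    # propagate the error back through W2

    # Gradient descent step
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(loss_history[0], loss_history[-1])
```

The backward pass is the whole trick: each layer's gradient is computed from the one after it, so the cost of computing all the gradients is about the same as one forward pass.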

------
jacobkg
It's amazing how far we have come in the past decade. I took 6.034 with
Professor Winston in Fall 2006 (and agree completely that he is an amazing
teacher). I remember there being only one lecture covering neural networks,
in which it was remarked that they were interesting in theory but
disappointing in practice.

~~~
jacquesm
There have been four such revolutions so far, and I strongly feel this will be
the fifth:

      - transistors
      - (micro)computers
      - (smart)mobile phones
      - the web

And now:

      - practical pattern recognition using neural nets,
        aka deep learning

~~~
spyhi
Agreed, though I draw the parallel elsewhere. The industrial revolution was
about automating labor, though really it just scaled up repeatable processes.
Likewise, computer programs don't really automate mental labor, they just
scale up repeatable processes once the mental labor of figuring out the
specification is done. Deep learning, on the other hand, promises the
automation of actual mental labor--creating new information from other,
unrelated information. If you look at it that way, its role and future seem
pretty obvious.

------
dmh2000
Incredibly good teacher. In the same class as Robert Sedgewick at Princeton.

------
pratap103
One of the prerequisites is differential calculus. Please suggest some good
resources to learn from!

~~~
chestervonwinch
You might try Strang's book [1]. You probably only need to cover chapters
1, 2, 3, 4, 6, 11, and 13.

[1]:
[https://ocw.mit.edu/ans7870/resources/Strang/Edited/Calculus...](https://ocw.mit.edu/ans7870/resources/Strang/Edited/Calculus/Calculus.pdf)
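If it helps to see why calculus is the prerequisite: training a network is just driving a loss downhill along its derivative. A tiny sketch (my own illustration, with an arbitrary example function) comparing a central-difference approximation against the known analytic derivative:

```python
def f(x):
    return x ** 3  # example function; its analytic derivative is 3 * x**2

def numerical_derivative(f, x, h=1e-6):
    # Central difference: (f(x+h) - f(x-h)) / 2h approximates f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

approx = numerical_derivative(f, 2.0)
exact = 3 * 2.0 ** 2  # 12.0
print(approx, exact)
```

This is exactly the gradient-checking trick people use to verify backpropagation code: the numerical estimate should match the analytic gradient to several decimal places.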

~~~
pratap103
Thanks a lot! It's been a while since I took calculus.

------
agjacobson
Great post.

