
Deep Learning for Coders – Launching Deep Learning Part 2 - jph00
http://www.fast.ai/2017/07/28/deep-learning-part-two-launch/
======
metafunctor
Part 1 was great.

However, the first lesson took a bit of stamina to go through. Much of it was
introducing basic Unix/AWS/shell/Python things I know intimately and have
strong opinions and deeply set ways about. Shell aliases, how to use AWS, what
Python distribution to run, running Python from some crazy web tool called
notebooks (and not Emacs), and so on. It felt like I was being forced to learn
a random selection of randomly flavored tools for no good reason.

Yes, it's a random selection of tools. The good reason to bear them is that
you'll learn how to implement state of the art deep learning solutions for a
lot of common problems.

So, I ended up viewing the lessons not as "this is how you should do it", but
rather as "here's one way to do it". And it does get much easier after
internalizing the tools in Lesson 1.

Just something to keep in mind when branding this as "deep learning for
coders". Coders have deep opinions about the tools they use :)

~~~
learner101
I come from the opposite side of the spectrum. I know nothing about
UNIX/AWS/Shell and have no intimate or strong opinions on any of the tools. I
am confused by basics such as how to make a special request for P2 access,
since the setup video is out of date. The git cloning recommendations on the
wiki are also confusing to me. I'd rather see a video on how to do the git
cloning, or step-by-step instructions (with pictures) on how to do this and
what to expect once it has been done properly.

I felt like some of the material was outdated, and it wasn't clear to me how
to work around it. I ended up getting frustrated with the setup videos and git
cloning instructions, so I skipped them entirely. I hope future iterations of
this class spell things out for beginners like me. Otherwise, I can't even
begin the class, since I don't know how to request P2 access, clone the git
repo, etc.

I am not uneducated. I come from a pure mathematics background and don't know
anything about this setup business. I can code and know theoretical CS, but
when it comes to setting up the tools (along with the outdated material) I am
utterly lost.

At this point I'm just watching the videos. I can't actually do any of the
coding stuff since I don't have the tools set up, but I like their top-down
approach and am learning a lot despite these obstacles.

~~~
jph00
We (including the rest of the learning community) would love to help you -
could you post on the forums the problems you are having and we'll try to help
sort them out? If you've already posted, please at-mention me there so I don't
miss it.

It's a fine line between helping too much vs too little in lesson 1! For the
next time we run part 1, we're thinking we'll have an optional weekend
workshop to teach the necessary (non deep learning specific) software pieces -
python, numpy, AWS, shell, etc. That would end up being a separate mini-MOOC I
suspect.

Based on the discussions on the forums it seems most students are between
these two extremes, and largely follow the video advice, branching off
sometimes where they need to do some additional research, or are already
familiar with some alternative approach.

_Edit: actually, thinking about it more - your best bet is probably to simply
use [http://crestle.com](http://crestle.com). All the data, notebooks, and
software are pre-installed, so you can start coding right away._

~~~
learner101
Thanks for the reply. I had not (yet) posted on the forum, as I was
methodically searching through the previous replies to ensure my question had
not already been answered. Thanks for the Crestle recommendation; I will look
into it now. This should simplify my tool setup. If I still have questions
about tool setup after posting, I will at-mention you. I think the mini-MOOC
(for people like me) is a great idea. I get lost in basic tooling (that many
experienced devs can skip), so a mini-MOOC on tool setup would benefit people
like me tremendously and help us get up to speed with everyone else.

Will the revamped course in October be offered online (like the current
version is) for public viewing?

~~~
jph00
Yes it'll be online a couple of months after completion, or you can apply to
join remotely as an international fellow.

~~~
learner101
Excellent. Thank you.

------
phunge
Highly recommended! The first course was the first thing I came across that
helped me contextualize the DL field into something that might be relevant for
my work. It's a great way to get your hands dirty.

One point of comparison is Cam Davidson-Pilon's Bayesian Methods for Hackers;
they have a similar vibe: practical applied advice from a field that tilts
towards the academic...

~~~
jph00
That's a very flattering comparison! :) Such a great book - here's a link for
those interested in bayesian methods:
[http://nbviewer.jupyter.org/github/CamDavidsonPilon/Probabil...](http://nbviewer.jupyter.org/github/CamDavidsonPilon/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers/blob/master/Prologue/Prologue.ipynb)

In fact that book inspired me to create a spreadsheet that implements MCMC in
order to make it easy to understand and visualize - we're planning to start an
"Introduction to Machine Learning" course in a couple of months where I hope
to show off the result of this...

------
ashkat
Thank you so much for this. For me, Deep Learning Part 1 was a top-notch
course that really helped me learn by actually doing things across a variety
of topics (e.g. competing on Kaggle, creating spreadsheets to understand
collaborative filtering & embeddings, sentiment analysis using CNNs and RNNs,
etc.). I found the top-down approach very effective in keeping me motivated as
I worked my way through the course. It took me 6 months of watching (and
rewatching) the videos and working on problems to get comfortable.

I have done a few MOOCs: Andrew Ng's machine learning course, the Coursera ML
specialisation, and edX's Analytics Edge. All of them were good learning
experiences, but fast.ai's Deep Learning Part 1 really stood out.

For me, the combination of the Deep Learning Book + the fast.ai MOOC + CS231n
(YouTube videos & assignments) covers almost everything I want to learn about
the subject.

@jph00, I'm halfway through neural style transfer and I am loving it.

------
jph00
I somehow forgot to mention in the post - we're teaching a totally updated
part 1 course (Keras 2, Python 3.6, TensorFlow 1.3, recent deep learning
research results) starting at the end of October in San Francisco. Details
here:
[https://www.usfca.edu/data-institute/certificates/deep-learn...](https://www.usfca.edu/data-institute/certificates/deep-learning-part-one)

I'll go edit the post with this info now - but figured I'd add a comment here
for those that have already read it.

------
colmvp
My feelings on Part 1:

I felt like the setup for the first part was at times a little frustrating,
since I started it during a time when Keras had switched to a newer version
which wasn't compatible with some of the utility code that was written. Add to
this the newbie factor with notebooks, and it was a pretty rough first week or
so to get set up and get actual learning done. It took me a bit of time to
realize notebooks are more like repeatable trains of thought than well-written
production code.
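One cheap way to catch that kind of version breakage early is to check the library version at the top of a notebook before running any utility code. This is just a sketch; the major-version pin below is an illustrative assumption, not the course's actual requirement (check the course wiki for the versions the notebooks expect):

```python
# Sketch: fail fast when a notebook's utility code was written against an
# older library API. The expected major version here is illustrative only.
def major_version(version_string: str) -> int:
    """Extract the major version number from a string like '1.2.2'."""
    return int(version_string.split(".")[0])

def check_compat(installed: str, expected_major: int) -> bool:
    """True if the installed version matches the major version the code expects."""
    return major_version(installed) == expected_major

# The part 1 utility code predated Keras 2, so a Keras 1.x guard might look like:
assert check_compat("1.2.2", expected_major=1)
assert not check_compat("2.0.8", expected_major=1)
```

In a real notebook you'd pass `keras.__version__` instead of a literal string, and raise an error message telling the reader which version to install.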

The other thing is that some of the supplementary material was really long and
at times made me feel like: why take this course instead of just going through
a course mentioned in the supplementary material (e.g. CS231n w.r.t. CNNs)? I
think I ended up spending hundreds of hours reading/watching/practicing CNNs
by reading papers, watching Karpathy's 231n videos, and doing a couple of
tutorials from data scientists who elaborated on a specific problem they were
solving. I guess at times when watching Part 1's videos and doing the
notebooks, I didn't feel like I was 'getting it' as much or as fast as when I
was getting the information by other means.

While the forum discussions can be helpful, it was also wading through a ton
of unstructured content. And the service they used for the forums remapped the
find shortcut to their own built-in search, which was a little annoying. I
don't know a great solution for having more structured data, but perhaps
adding some of the questions that were answered to the lesson's wiki page. Or
maybe splitting the technical issues from the high-level concepts.

Lastly, I think it was either HN or /r/MachineLearning, but someone had
suggested a book on machine learning and hands-on TensorFlow usage, which I
picked up, and I felt like my pace of learning really sped up afterwards. I
think part of it is that a lot more has been written about TensorFlow, so when
you encounter an odd problem, chances are someone else has something to say
about it.

All criticisms aside, I think I'll try going through Part 1 a second time
around prior to going through Part 2.

~~~
drewbuschhorn
I had the opposite experience. I'm basically as non-math as you can get and
still be in the sciences, and I found the classes quite intelligible (on a
fast watch without the notebook, and then a slow watch with the notebook for
each class).

FWIW, I think the supplementary material wasn't strictly necessary from a
'using the libraries' point of view. I'll never contribute to this field, but
I feel like Jeremy's explanations were conceptually helpful, if not rigorous.

For me, the order was: lesson, notebook + lesson, wiki + supplementary
material if something wasn't making sense, and the discussion board if all
else failed. That discussion board is basically useless unless you're taking
the class in real time, I think, which has been my experience with all MOOCs.

Different strokes for different folks.

------
DrNuke
The n00best path to data science and machine learning state of the art is now
complete, no excuses! 2015: Andrew Ng's Coursera MOOC; 2016: Kaggle
competitions with xgboost and ensembles; 2017: deep learning code-oriented
courses with fast.ai and GPU hardware for the masses. Thanks, very lucky to
witness and try this.

------
daedalus13
jph00, I found the first course hard to follow because of some broken links
and poorly organized content. One link that was necessary kept taking me to a
password-protected page. This was about a month ago.

It would be good if someone could revisit part 1 and make those minor
editorial fixes, if they haven't already done so.

I might be being too precious about my time, but I also found the first video
about your teaching philosophy somewhat gratuitous; I wish I hadn't watched
it.

~~~
jph00
The move from platform.ai to files.fast.ai could have been communicated better
- sorry this impacted you. (We tried to highlight everywhere we could, but we
can't change the video itself on YouTube unfortunately.)

We're redoing the whole of part 1 starting in October so this problem will be
fully resolved then. Until then, follow the links on course.fast.ai or the
forums, rather than what you see in the part 1 videos, or just remember to
always replace platform.ai with course.fast.ai.

~~~
jph00
Oh and about the teaching philosophy video - until we posted that we had quite
a few students express confusion about the top-down approach. After posting
it, we've received a lot of positive feedback about it. I understand it's not
helpful or interesting to everyone, but overall it seems to have been a
successful addition for most.

~~~
drieddust
I am one of those guys who likes to understand everything before beginning to
do anything meaningful. Needless to say I mostly fail.

Your lecture on top down course philosophy actually helped me change my
perspective. It's a welcome positive change so thank you.

------
natch
Afraid I may have missed the window on the chance to provide feedback to jph00
via this channel, but here goes.

Am watching Part 1 now and only two sessions in, but there are some tweaks I
would love to see. First the positive: I really appreciate the approach of
hands-on and teaching theory only as it's needed and in conjunction with
applied work.

Would love to see a tiny bit of time spent on setting up tools for people who
already have good Nvidia GPU systems. My Ubuntu system has python (2.7) and
python 3.5 both installed, but no Anaconda... I don't know if I'm going to
totally screw up my system if I install Anaconda over those working existing
python installations, for example.

It would be great to hear the questions. I can barely hear a faint voice in
the background as Rachel reads the questions (presumably from online) but it
seems like it would be a very easy tweak to have her closer to a microphone.
Maybe this happens in later sessions and I just haven't gotten to them yet.

It would be great if so many things weren't abbreviated in the code variable
and function names. Examples: nb for notebook, t for ?, a for array(?), U, s,
and Vh for ?, ims (?), interp (interpretation or interpreter or
interpolation?), sp, v, r, f, k, trn (train or turn or something else?), pred
(predicate or prediction?), vec_numba (?)... the list goes on. Yes, if I knew
the field these might be obvious, but for some of them I'm still learning.
"np" I understand, since that's standard practice and you explained it. It
would be
really really easy to just spell out words in the code, as well as being a
good practice in general imho, and, since you are trying to teach stuff, it
would seem appropriate.
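To natch's point, a few of those names aren't arbitrary: `U`, `s`, and `Vh`, for instance, are the names numpy itself uses for the factors of a singular value decomposition. A small sketch of where they come from, with the kind of longer aliases natch is asking for (the descriptive names are my suggestions, not course code):

```python
import numpy as np

# U, s, Vh follow numpy's own SVD convention: a = U @ diag(s) @ Vh
a = np.arange(6, dtype=float).reshape(2, 3)
U, s, Vh = np.linalg.svd(a, full_matrices=False)

# Descriptive aliases cost nothing and read better in teaching code:
left_vectors, singular_values, right_vectors_t = U, s, Vh

# The three factors reconstruct the original matrix:
reconstructed = left_vectors @ np.diag(singular_values) @ right_vectors_t
assert np.allclose(reconstructed, a)
```

So the abbreviations sometimes mirror a library's API, which is a defensible teaching choice, even if spelling them out once in a comment would help newcomers.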

Those nitpicks aside I'm really stoked about the course and really appreciate
everything you have been putting into it!

~~~
mkl
> Would love to see a tiny bit of time spent on setting up tools for people
> who already have good Nvidia GPU systems. My Ubuntu system has python (2.7)
> and python 3.5 both installed, but no Anaconda... I don't know if I'm going
> to totally screw up my system if I install Anaconda over those working
> existing python installations, for example.

Anaconda lives in its own folder (usually in $HOME). You can't screw anything
up by installing it, and in fact you can hardly tell it's there. You need to
set your path to actually use Anaconda's programs, and you shouldn't do that
in .bashrc, but just in the shells where you are actively using it, with
something like:

      export PATH=/home/<you>/Development/Tools/Anaconda3/bin:$PATH
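To see that this PATH change is local to the shell you make it in, here's a toy demo using a fake install directory (the `/tmp/fake_anaconda` path is purely illustrative; substitute wherever you actually installed Anaconda):

```shell
# Simulate an Anaconda-style install living in its own folder; the system
# python is untouched until PATH is changed in this particular shell.
mkdir -p /tmp/fake_anaconda/bin
printf '#!/bin/sh\necho fake-anaconda-python\n' > /tmp/fake_anaconda/bin/python
chmod +x /tmp/fake_anaconda/bin/python

# Prepending to PATH affects only the current shell session:
PATH="/tmp/fake_anaconda/bin:$PATH"
command -v python   # now resolves to /tmp/fake_anaconda/bin/python
```

Open a new shell (or simply don't put the PATH change in .bashrc) and your system pythons are back in front.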

------
alexcnwy
Honestly can't recommend this course highly enough.

It's definitely not perfect - the notebooks are not commented and the material
does tend to jump around a bit - but what it does do, it does extremely well.

This course will teach you how to actually build deep learning systems and
build the kinds of things you read about PhDs doing...

------
BrianMingus
Latently (SUS17) also provides a more self-directed path to learning deep
learning focused exclusively on implementing research papers and conducting
original research:
[https://github.com/Latently/DeepLearningCertificate](https://github.com/Latently/DeepLearningCertificate)

~~~
pzh
That seems interesting, but there are so many papers and little indication of
which ones are more important (or which ones to implement first). I realize
this is for advanced learners, but some guidelines, or at least a section
pointing to survey papers would be really helpful as a starting point.

~~~
BrianMingus
We have a bibliography - it is not yet organized very well but we are working
on that:
[https://paperpile.com/shared/UhfbVO](https://paperpile.com/shared/UhfbVO)

------
edshiro
This is exciting! I went through Part 1 a few weeks ago (probably have to
cover embeddings and RNNs again...) and felt it was totally worth it.

Part 2 seems equally strong in content (if not stronger). It's a beautiful
time to be a n00b in deep learning & AI, and to learn via material like this.
No excuses. Knowledge is power.

------
mcintyre1994
I've been looking to do part 1, so this is really cool - looking forward to
this too! On
[http://course.fast.ai/part2.html](http://course.fast.ai/part2.html), the
thumbnail for lesson 8 has specs for building a PC, with advice to use
pcpartpicker. For part 1 I liked the idea of using AWS and only paying for a
few hours; does part 2 have a hard requirement of investing hundreds of
dollars in hardware?

~~~
jmeyers44
There is no requirement, but if you are spending a meaningful amount of money
per month on AWS (over $100) and plan on working on DL projects for the next
year or two, it might make sense to make the initial up-front investment.

~~~
alexcnwy
You can definitely get by just using AWS, and the flexibility is great, but it
can get expensive.

The Tiramisu model from lesson 14 takes about 25 hours to train, and at $0.90
per hour, that adds up pretty fast...
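The arithmetic behind "adds up pretty fast" is easy to sketch out. The GPU price below is a made-up round number for illustration, not a quote or recommendation:

```python
# Back-of-envelope: renting on AWS vs buying a GPU for repeated long runs.
p2_hourly_usd = 0.90     # p2.xlarge on-demand rate mentioned above
hours_per_run = 25       # e.g. the lesson 14 Tiramisu training time

cost_per_run = p2_hourly_usd * hours_per_run      # 22.50 USD per full run
gpu_price_usd = 700.0                             # hypothetical GPU cost
breakeven_runs = gpu_price_usd / cost_per_run     # runs before the card pays off

print(f"AWS cost per run: ${cost_per_run:.2f}")
print(f"Runs to break even on a ${gpu_price_usd:.0f} GPU: {breakeven_runs:.0f}")
```

This ignores electricity, depreciation, and the convenience of not maintaining drivers, but it shows why heavy users tend to buy hardware while occasional learners stick with AWS.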

------
Omnipresent
For folks who've gone through parts 1 and 2: do you think the course provides
enough material to tackle tasks like deep learning OCR [1] or custom object
detection in images?

[1]: [https://blogs.dropbox.com/tech/2017/04/creating-a-modern-ocr...](https://blogs.dropbox.com/tech/2017/04/creating-a-modern-ocr-pipeline-using-computer-vision-and-deep-learning/)

~~~
jph00
Yes, you should absolutely be able to tackle those tasks after doing the
course.

------
throwaway12017
What is the goal of these trainings? To get a taste so you understand the
conversation? There is a lot more to data science than neural networks, and I
worry that teaching one family of models will create a set of implementers
who don't compare and contrast solutions.

~~~
jacquesm
Practical applications rather than theoretical knowledge. In other words, if
you're a programmer, it puts more tools in your toolbox.

~~~
throwaway12017
Sounds fair. I'll have to watch the videos to form a better opinion.

------
bitL
Wonderful! You picked a really nice selection! Can't wait to do them all!
Thank you!

------
cs702
Based on the feedback I'm reading here about Part 1, I'm going to start
recommending these courses to non-academic friends who have expressed interest
in learning more about Deep Learning.

THANK YOU for doing this.

------
cakedoggie
They don't even have a link to part 1 at the start of the article??

~~~
jph00
The very first link of the article is a link to part 1.

I'll also link the text 'part 1' in that paragraph to make it clearer.

~~~
cakedoggie
Ah ok, confusingly this is the link to part 1:
[http://course.fast.ai/](http://course.fast.ai/)

This seems a bit odd.

------
mikden
Looking forward to part 2, Jeremy! Part 1 was nothing short of excellent.

------
Tsagadai
jph00, I would just like to thank you for the first course and now the second
course. I've thoroughly enjoyed both and they have taught me a lot.

