

Neural Networks for Machine Learning - ameasure
https://www.coursera.org/course/neuralnets

======
lathamcity
I'm in the middle of the machine learning coursera course, and registered for
this one as well due to interest in the material.

My one complaint is that the programming assignments weren't interesting at
all. The results were interesting, but the setups were mostly given to us, and
we just had to code an algorithm that was in our notes. For someone who
understands the basics of linear algebra and programming, it was just a syntax
challenge, and that got irritating after a bit so I stopped doing them.

I won't get the certificate for completing the course, but I have a few extra
hours of free time each week to add this second course, so I'm happy. I doubt
that the actual homework that Stanford students taking this course get is so
easy and repetitive, though, and I'm positive they wouldn't complain about not
getting to retake quizzes after getting poor grades.

Not to knock the course. I've learned a lot and the professor (Andrew Ng) does
a good job.

~~~
FaceKicker
> My one complaint is that the programming assignments weren't interesting at
> all. The results were interesting, but the setups were mostly given to us,
> and we just had to code an algorithm that was in our notes. For someone who
> understands the basics of linear algebra and programming, it was just a
> syntax challenge, and that got irritating after a bit so I stopped doing
> them.

I agree with this. The programming assignments I've done so far in the Machine
Learning class are usually 5-7 matlab functions, many of which are about 2
lines of code (the longer ones might be ~10 lines of code). If you've ever
done matlab/octave programming the assignments will take about 20-30 minutes
and be completely unenlightening as you're literally just translating
mathematical notation into matlab (which is, by design, already a lot like
mathematical notation anyway). They provide _way_ too much skeleton code to
learn anything from if you're not actively trying to learn. If I weren't
already familiar with most of the material presented in the class, I imagine I
would never retain knowledge of how the machine learning
"pipeline" worked or have any high-level understanding of the algorithms,
because the assignments just require you to implement the mathematical pieces
of each step, without ever asking you to, for example, actually call any
optimization routines, or put the pipeline together.
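
For contrast, here's roughly what "putting the pipeline together" looks like
end to end, as a toy sketch in Python (the course assignments are in octave;
the data, learning rate, and iteration count below are made-up illustration
values): generate data, split it, run the optimization loop yourself, then
evaluate.

```python
import math
import random

# Toy end-to-end pipeline: fake 1-D data, a train/test split, logistic
# regression fit by batch gradient descent, then an accuracy check.
random.seed(0)

# Points above x = 5 are labeled 1, below are labeled 0, with some noise.
xs = [random.uniform(0, 10) for _ in range(200)]
data = [(x, 1 if x + random.gauss(0, 1) > 5 else 0) for x in xs]

split = int(0.8 * len(data))
train, test = data[:split], data[split:]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Batch gradient descent on the logistic (cross-entropy) loss.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    gw = gb = 0.0
    for x, y in train:
        err = sigmoid(w * x + b) - y    # dLoss/dz for logistic loss
        gw += err * x
        gb += err
    w -= lr * gw / len(train)
    b -= lr * gb / len(train)

accuracy = sum((sigmoid(w * x + b) > 0.5) == (y == 1)
               for x, y in test) / len(test)
print(f"test accuracy: {accuracy:.2f}")
```

Nothing here is deep; the point is that picking the split, the learning rate,
and the stopping condition is where the real learning happens, and that's
exactly the part the skeleton code does for you.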

The problem, I think, is that it would just be too difficult to do automatic
grading in a way that is reasonably possible to pass if they don't turn most
of the work into skeleton code. Since the automatic grading needs nearly
exactly matching results, one minor implementation difference in a perfectly
good implementation of the algorithm itself (e.g., picking a single parameter
incorrectly, picking the optimization termination conditions incorrectly,
choosing a different train/dev split, etc.) would make the entire solution
completely wrong.

~~~
waterlesscloud
I'm doing the Computational Finance class via Coursera at the moment, and I've
done a number of other courses previously.

I agree the programming assignments in the Finance class tend to be too
simple. Most of the code is literally handed to you; you just have to
understand it well enough to change it. I also understand that even that can
be a major challenge if you don't have the background for it.

But I'm choosing to see the class itself as a starting point. It's a framework
for my own explorations into the topics. I can do the minimum and get the
minimum out of it. Or I can use what's provided as a base and go further.

Take the Coursera Algorithms class, for example. Writing code that got the answer
was relatively easy, so once that step was done it became about optimizing the
code for my own learning benefit.

It's like _any_ educational process: you get out what you put in.

~~~
FaceKicker
Right, you _can_ get more out of the assignments if you try, but to me the
purpose of assignments (versus passive learning - lectures, reading, etc.) is
to _force_ your brain to synthesize rather than just comprehend. The ideal
assignment, then, is one that forces you to synthesize as many of the concepts
it intends to teach as possible.

Just like you could go back and implement for yourself the skeleton code they
handed you, you could also go out and implement everything in the lectures
without any assignments at all. It's just that, like you said, the assignments
provide a useful starting point. And I'm only saying they could be even _more_
useful by requiring you to implement more of the complete pipeline.

The fact that an incredibly self-motivated person could learn everything there
is to know about machine learning with the course as a starting point doesn't
mean that it's bad to make the course more useful for a somewhat lazier or
less interested person.

------
ameasure
Hinton is a huge figure in the neural network literature and an important
researcher in deep learning. After going through the first week of lectures, I
can say he's also an excellent teacher.

The syllabus, draft though it is, indicates the second half of the class will
focus on deep learning, a field of machine learning that has demonstrated huge
potential.

------
jimbokun
Just browsing through the Coursera Computer Science listings, it looks like
they are rapidly approaching the point where you could put together a CS
curriculum superior to what you could get at any single school. The people
they have teaching a lot of these topics are some of the best in the world in
their field. The Michael Collins NLP course looks really thorough and up to
date, for example; I took a similar course a few years ago, and I remember
reading papers written by him.

As has been said by many already, of course, the remaining nuts to crack are
high quality interaction with other students, professors, and TAs; and
accreditation.

But the dis-intermediation of large universities may be nearer than we think.

~~~
misiti3780
the only real problem with coursera is everyone is posting their solutions to
github, so it's going to be impossible for them to prevent cheating. i agree
with you though, that the flexibility it offers is amazing

~~~
ChuckMcM
There is an interesting practical question here. Why cheat?

If you are taking a class voluntarily over the Internet, what benefit would be
gained by cheating? I presume that a large fraction of people who are doing
volunteer coursework are doing it to learn, not to keep a GPA up for some
other reason (sports eligibility, scholarship requirements, parental
expectations, etc.) so looking at other solutions on Github might actually
enhance the experience for you. If you find a way to do it better than the
other solutions, that could be a goal in itself.

This is one of those things I find most intriguing about 'free' classes on the
Internet: the value equation is shifted around.

~~~
salman89
It depends on the purpose of your education. In an ideal world, it would be
just to learn, but I think employers at some level look at grades/school as a
qualification process.

~~~
ChuckMcM
I see where you are coming from, but were I interviewing you I would never even
think to wonder about a self-reported grade in a volunteer class. If the topic
was important to the position I'd ask you to talk about it and tell me what
you learned. I would hope I could spot you trying to feed me a line.

At the end of the day, as an employer, I am looking for 'learners' not
'cheaters.' If it turns out that an employee's personality/choices lean toward
the cheating side I try to manage them out of the organization as smoothly as
I can.

~~~
waterlesscloud
If it were a traditional class at a traditional school you'd just assume they
actually learned it?

~~~
ChuckMcM
Hmm, that is a fair question. I think I would give more weight to a class if
they took it when they didn't have to, rather than having taken it as part of a
requirement for a degree.

------
misiti3780
people are already complaining that you can only take the quizzes once ... he
had to send out an email today to everyone saying:

"Many of you are unhappy with only being allowed to attempt a quiz once.
Starting in week two, we have therefore decided to make up twice as many
questions and to allow you to do each quiz twice if you want to. The second
time you try it the questions will all be different. Your score will be the
maximum of your two scores. For week one, the quizzes will remain as they are
now.

Many of you would like the names of the videos to be more informative. We will
change the names to indicate the content and the duration.

Some of you thought that some of the quiz questions were too vague. We will
try to make future questions less vague.

Some of you are unhappy that we do not have the resources to support Python
for the programming assignments. We sympathize with you and would do it if we
could. You are still welcome to use Python (or any other language) if you can
port the octave starter code to your preferred language. We have no objection
to people sharing the ported versions of the starter code (but only the
starter code!). However, if you get starter code in another language from
someone else, you are responsible for making sure it does not contain bugs."

I thought that was pretty funny!

~~~
notimetorelax
Yeap we got spoiled with earlier classes: Algorithms by Tim Roughgarden,
Machine Learning by Andrew Ng, and many more. We probably need to follow a
class on gratitude.

Oh well, to be fair I would donate quite a lot for each course that I enjoyed.

~~~
jberryman
Actually all the entitled bitching and moaning on the ML class forum was by
far the biggest turnoff of the whole experience for me. I was much happier
after ignoring it and my "classmates" entirely.

------
emcl
The only course that is not significantly diluted is Koller's PGM. All others
have been dumbed down to a degree where they provide no challenge to the
student at all.

~~~
notimetorelax
It is not such a huge problem when you take several courses at once. Sadly
they run them only twice a year, each time I try to follow as many as
possible. I cannot follow PGM because it requires too much of my time, I'd
have to abandon 2 or 3 other courses. YMMV.

~~~
tomku
I'm looking at the same problem at the moment. PGM sounds really interesting,
but I think that the time investment just isn't going to be workable for me
unless I drop several of my other classes. My current plan is to watch the PGM
videos and try to keep up with the programming assignments as long as I can,
but if it comes down to a choice of one or the other, PGM will be the one to
go.

As far as "dumbing down", I've found that the Coursera classes that I've taken
(Compilers, Automata Theory, Algorithms 1, SaaS and Machine Learning) have
varied in difficulty quite widely. Compilers and Automata were both
challenging and enjoyable; Algorithms 1 was about what I'd expect from a
freshman/sophomore algorithms class; and SaaS and Machine Learning were easy
enough that they should be approachable to anyone with basic programming
experience.

I don't feel that the difficulty in the classes that I've taken had any
particular correlation with teaching effectiveness. I found Andrew Ng's ML
class to be simple, but still interesting and informative - you come out of it
with enough of a basic understanding to implement simple ML techniques as well
as a place to start if you wish to learn more. I think that while a theory-
centric class would be a nice thing to have, he's done an amazing job of
making a class that can appeal to a wide range of potential students and
introduce them to a field that's usually very difficult to approach.

------
azakai
> Neural Networks are gradually taking over from simpler Machine Learning
> methods

And haven't SVMs and such gradually taken over from Neural Networks?

~~~
rm999
I am not an expert in SVMs, but I consider myself fairly experienced in
machine learning. In my professional experience the answer to your question is
'not quite'. SVMs have solved some problems very well, but I've had issues
with them:

1\. They are only for classification, and not every problem is classification.
The other big category is regression, for example predicting the sale price of
a home rather than predicting a binary "will it sell".

2\. They don't have a natural probabilistic interpretation for classification.
Neural networks for classification (with a logistic activation function) are
trained to predict a probability, not make a simple binary decision. In
practice this probability is usually very useful, although I believe SVMs have
been modified to give some kind of probability.

3\. I have had a tough time getting them to run quickly. Linear kernel SVMs
are fast, but aren't powerful. More complex kernels are more powerful but can
be very slow on moderately large datasets.
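
Point 2 can be made concrete in a couple of lines: a logistic output unit
squashes the raw score into (0, 1), which you can read as a probability,
whereas an SVM's decision function only hands you a signed score. (The scores
below are made-up illustration values.)

```python
import math

def sigmoid(z):
    # Logistic activation: maps a raw score to (0, 1), readable as
    # P(class = 1 | input).
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical raw scores (w.x + b) for three examples.
for z in [-2.0, 0.1, 3.0]:
    p = sigmoid(z)
    hard = 1 if z > 0 else -1           # an SVM-style hard decision
    print(f"score {z:+.1f} -> P(y=1) = {p:.3f}, hard label {hard:+d}")
```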

~~~
achompas
SVMs are very much used for regression as well:

<http://scikit-learn.org/stable/modules/svm.html#regression>

Note: the scikit-learn implementation of SVMs is based on libsvm:

<http://www.csie.ntu.edu.tw/~cjlin/libsvm/>

~~~
rm999
Interesting, a quick glance at a paper on SVRs indicates they kind of work in
the opposite manner of an SVM: in an SVM you try to maximize the number of
points far away from the separator (taking into account class), whereas in
regression you are trying to minimize this.
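
The two loss functions make the flip explicit: the SVM's hinge loss is zero
once a point is safely beyond the margin, while the epsilon-insensitive loss
used in SVR is zero while the prediction stays _close_ to the target. A rough
sketch (the margin and epsilon values here are arbitrary illustration choices):

```python
def hinge_loss(y_true, score):
    # Classification (SVM): zero loss once the point is beyond the
    # margin on the correct side, i.e. y * score >= 1.
    return max(0.0, 1.0 - y_true * score)

def eps_insensitive_loss(y_true, y_pred, eps=0.5):
    # Regression (SVR): zero loss while the prediction stays within
    # an epsilon "pipe" around the target.
    return max(0.0, abs(y_true - y_pred) - eps)

print(hinge_loss(+1, 2.0))              # well past the margin: no loss
print(hinge_loss(+1, 0.2))              # inside the margin: penalized
print(eps_insensitive_loss(3.0, 3.3))   # inside the pipe: no loss
print(eps_insensitive_loss(3.0, 4.5))   # outside the pipe: penalized
```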

Do you have much background using them? I'm curious how they perform on
real-world tasks.

~~~
achompas
Yeah, there's the SVR "pipe" concept, where you attempt to fit the margin s.t.
points are close to it. It's a great alternate use of SVM's obj. function
optimization.

I haven't really used SVRs aside from some exploratory work, so I can't speak
too much about them. But I know they exist!

------
isakovic
I took Professor Hinton's course on Neural Networks as an undergrad. This man
is the most intelligent person I have ever met. He is one of the giants.

------
tocomment
Did it already start? Is it too late to start?

Also, I took a neural networks class in college, so do you think I would get
much more out of this?

~~~
ameasure
It just started on Monday, there's plenty of time to join in.

There have been some huge developments in neural networks in the last few
years, particularly with respect to deep learning. If you missed out on that
you might want to try this class. Hinton has been involved in many of these
advances.

The second half of the course appears to focus on deep learning topics so you
might want to start there if you already know the basics.

~~~
misiti3780
you can't start mid-way through ... right?

~~~
ameasure
You'll have to wait until those lectures are made available, but you don't
have to complete the previous work to see the lectures.

------
rubashov
I tried to do a couple coursera courses and found the video lectures highly
inefficient; needlessly time-consuming, even watching them sped up. All I
really want is a glorified textbook with quiz grading and a final.

~~~
vitno
this. No offense to these professors, but what are they presenting in their
video lectures that I can't garner from their writing?

~~~
saraid216
Humanity. Which, believe it or not, makes a huge difference in learning
subjects.

