

Stanford Machine Learning course - csantini
http://theorymatters.wordpress.com/2009/02/02/stanford-machine-learning-course/

======
physcab
I'm in a Machine Learning class right now and the math is absurd. Our
professor goes over general theory in class and references some practical
applications. For homework we have to prove the theories and then create
MATLAB programs that create the figures in our textbook (Bishop). It's pretty
dense and time consuming.

However, I come from a Physics background, so the math is just different, and
may in fact be easier. That said, I've been referencing my Mathematical
Methods in the Physical Sciences book (Boas) pretty often to brush up on some
linear algebra techniques.

Lastly, it's been said quite a bit in these forums, but if you are just
interested in implementing this stuff, definitely check out Programming
Collective Intelligence (O'Reilly). I use that book in tandem to bring some of
the high-level concepts back down to earth. :)

Good luck

------
mlinsey
You can see example final projects for this course here...
<http://www.stanford.edu/class/cs229/projects2007.html>

The midterm for CS 229 was probably the hardest exam I took in college.

------
FraaJad
I read some of the ML lectures notes from OCW some time back. It was very math
heavy. Is that an MIT thing or is that how ML is taught in all universities?

~~~
jerf
I cannot conceive of how one would have a "non-math-heavy" machine learning
course. Machine learning is basically a branch of applied statistics, at least
in its current form. None of the algorithms are "fire and forget", such that
you can use them effectively without deeply understanding them, and one of the
first things you'll cover in your Machine Learning course is why that is
unlikely to ever happen (the No Free Lunch theorem, often over-applied in
internet debates but still a very, very important result).

(I mean, sure, you can _try_ to fire-and-forget, but for the most part you'll
either fail with no clue why, or fail to get very good results. Decision trees
might be a moderate exception, although even then you really ought to
understand how they are constructed and the implications of various choices
you can make.)
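To make the decision-tree point concrete, here is a minimal sketch (not from the thread; toy data and helper names are my own) of one construction choice jerf alludes to: whether splits are scored by Gini impurity or by entropy (information gain). The mechanics are the same either way, but knowing what the criterion actually computes is exactly the kind of understanding he means.

```python
from collections import Counter
import math

def gini(labels):
    # Gini impurity: 1 - sum_i p_i^2 over class proportions p_i.
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def entropy(labels):
    # Shannon entropy: -sum_i p_i * log2(p_i) over class proportions p_i.
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def split_score(xs, ys, threshold, impurity):
    # Weighted impurity of the two children after splitting at `threshold`;
    # a greedy tree builder picks the threshold minimizing this score.
    left = [y for x, y in zip(xs, ys) if x <= threshold]
    right = [y for x, y in zip(xs, ys) if x > threshold]
    n = len(ys)
    return (len(left) / n) * impurity(left) + (len(right) / n) * impurity(right)

# Toy one-feature dataset: feature values and binary class labels.
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [0, 0, 1, 0, 1, 1]

# Score every candidate threshold under both criteria.
thresholds = [1.5, 2.5, 3.5, 4.5, 5.5]
best_gini = min(thresholds, key=lambda t: split_score(xs, ys, t, gini))
best_entropy = min(thresholds, key=lambda t: split_score(xs, ys, t, entropy))
print(best_gini, best_entropy)
```

On this tiny dataset both criteria happen to agree on the best threshold; on real data with many classes they can differ, which is why the choice is exposed as a parameter in most libraries.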

~~~
kurtosis
As a counterpoint, it seems to me that the ML field is too heavy on theory. A
lot of theoretical papers are proposed with weak or no experimental tests.
There are very few people doing thorough experimental work. What I have in
mind is something like Chen and Goodman's comparison of different language
models - this is a great paper and I'd love to see more like it.

The obvious example of this is boosting, where the theory lagged far behind
the experiments - everyone wanted to invent a theory to explain why boosting
was so effective.

hey industrialists out there - publish!

~~~
physcab
We're trying! I'm using machine learning to detect explosives. It's a
difficult problem, but it works quite well.

------
skenney26
The first lecture in this series mentions that the discussion groups are also
available via video. Has anyone found these?

