
Elements of Statistical Learning - sonabinu
http://www-stat.stanford.edu/~tibs/ElemStatLearn/
======
Eliezer
I highly endorse this book. Best explanation of linear regression I've ever
read. I think a good litmus test for stats books is whether they explain how
minimizing squared error corresponds to maximizing the likelihood under
Gaussian-distributed errors, and, of course, whether they acknowledge the
eternal light of Bayes shining in the background of all things.
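That correspondence is easy to check numerically. Here's a quick sketch of my own (not from the book, and the data/variable names are made up): fit a line once by ordinary least squares and once by numerically maximizing the Gaussian log-likelihood, and the coefficients come out the same, because for fixed noise scale the beta-dependent part of the negative log-likelihood is exactly the sum of squared errors.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one feature
beta_true = np.array([1.5, -2.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)      # Gaussian noise

# Least squares: minimize the sum of squared errors directly.
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

# Gaussian MLE: minimize the negative log-likelihood. With sigma held
# fixed, the beta-dependent term is 0.5 * ||y - X beta||^2 / sigma^2,
# so the minimizer is the same as the least-squares solution.
def neg_log_lik(beta, sigma=0.5):
    r = y - X @ beta
    return 0.5 * np.sum(r**2) / sigma**2 + n * np.log(sigma)

beta_mle = minimize(neg_log_lik, x0=np.zeros(2)).x

print(beta_ls, beta_mle)  # the two estimates agree to numerical tolerance
```

The Bayesian version of the same observation: with a flat prior, the posterior mode under Gaussian errors is again the least-squares fit.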

------
eykanal
I've gone through this book. I found it useful for theory, but not useful for
gaining an appreciation of how the models work in practice. Most practitioners
will appreciate a more applied book than this one.

~~~
achompas
Agreed. Any suggestions for those applied books? I haven't found any yet.

~~~
nwenzel
I found Machine Learning for Hackers to be a good practical, case-study-based
book for learning to apply and understand what is going on in several machine
learning problems.

<http://shop.oreilly.com/product/0636920018483.do>
[http://www.amazon.com/Machine-Learning-for-Hackers-ebook/dp/...](http://www.amazon.com/Machine-Learning-for-Hackers-ebook/dp/B007A0BNP4/)

It's definitely a practical approach. There's a lot of explanation but,
despite coming from two PhD candidates, it certainly did not read like an
academic paper.

It's also case study based, not necessarily algorithm based.

As a bonus, I used that book to teach myself R. I don't think it's meant to be
an intro to R tutorial, but it worked for me.

All three are positives in my book.

------
christopheraden
ESL is pretty much the go-to text for statisticians interested in machine
learning. The math in it might be a bit more difficult than some of the pages
you see on HN, but you won't get much better than this for a survey course on
machine learning from a statistician's perspective.

~~~
amikahmad
Interesting. I've been going through Gilbert Strang's book and courses. This
looks like a nice accompaniment.

------
microDude
I found Christopher Bishop's book to have a good deal of knowledge in it.
He also complements many ideas with graphical figures, which is a big plus.
[http://www.amazon.com/Pattern-Recognition-Learning-Informati...](http://www.amazon.com/Pattern-Recognition-Learning-Information-Statistics/dp/0387310738)

Also, if you need more information about optimization methods, all of Stephen
Boyd's books are really good; just check out his entire website for
information. <http://www.stanford.edu/~boyd/>

------
gtani
other free drafts/full open content books

[http://www.ics.uci.edu/~welling/teaching/273ASpring10/IntroM...](http://www.ics.uci.edu/~welling/teaching/273ASpring10/IntroMLBook.pdf)

<http://alex.smola.org/drafts/thebook.pdf>

<http://www.p-value.info/2012/11/free-datascience-books.html>

<http://web4.cs.ucl.ac.uk/staff/D.Barber/textbook/090113.pdf> (webserver's on
verge of falling over, maybe we can stagger requests)

And I think there are some others here: <https://news.ycombinator.com/item?id=4672930>

~~~
gtani
The one from UCI is a very rough draft; I shouldn't have included that.

------
misiti3780
Great book, but you need a very strong math background to understand most of
it (in my opinion).

~~~
gngeal
Looks like completely normal math to me. It's not like they're trying to shove
category theory down your throat or anything.

A useful book! Looks just like what I needed. If there's anything heavy about
it, it's the terse, encyclopedic feel; it's not a gentle introduction in that
sense.

~~~
hyperbovine
Well that is not very helpful. If the last math class you took was 20 years
ago, ESL probably seems rather intimidating. The good thing is that most of
the book requires only calculus and linear algebra. By brushing up on those
topics first, you could be in a position to understand most of the basics of
machine learning.

~~~
gngeal
"If the last math class you took was 20 years ago, ESL probably seems rather
intimidating."

Actually, only ten years ago. :-) But I _do_ need a refresher; I just started
working on it recently. I feel smarter today than I felt back then, so I
don't think it will hurt that much.

"The good thing is that most of the book requires only calculus and linear
algebra. By brushing up on those topics first, you could be in a position to
understand most of the basics of machine learning."

Yeah, I've noticed. The scope of the whole book goes _vastly_ beyond my
needs; my application (NLP-related) probably won't need more than the basic
stuff, but any good material at my disposal helps.

