
A high bias low-variance introduction to Machine Learning for physicists - yoquan
http://physics.bu.edu/~pankajm/MLnotebooks.html
======
ivan_ah
This is very, very, very good.

Book is here:
[https://arxiv.org/abs/1803.08823](https://arxiv.org/abs/1803.08823)

____

My review: The authors provide a condensed summary of all the central topics in
machine learning. Topics include ML basics, ML theory, and optimization
algorithms, but also a detailed introduction to modern deep learning methods.
Code examples and tutorials are provided as Jupyter notebooks for each
chapter [2]. The book uses three datasets (MNIST digit recognition, SUSY
physics data, and a simulated nearest-neighbor Ising model) as running examples
throughout, to help learners understand what different ML techniques
can bring when analyzing the same problems from different perspectives.

The book has high bias (since it is written from a physics perspective), but
low variance, since assuming a physics background allows the authors to write
a very focused narrative that gets to the point and communicates three books'
worth of information in 100 pages. This is somewhat of a repeat of the general
physics-ML-explanations-for-the-win pattern established in Bishop's `Pattern
Recognition and Machine Learning`.

The authors are wrong to label this book as useful only to people with a
physics background; in fact it will be useful for everyone who wants to
learn modern ML. An estimator with high bias but high efficiency is always
useful.
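To make that last point concrete, here is a toy numerical sketch (my own, not from the book): for Gaussian data, the biased variance estimator that divides by n actually has a lower mean-squared error than the unbiased one that divides by n-1, so trading a little bias for efficiency can win overall.

```python
import numpy as np

rng = np.random.default_rng(42)
n, trials, true_var = 10, 100_000, 1.0

# Many small Gaussian samples with known variance 1.0
samples = rng.normal(0.0, 1.0, size=(trials, n))

biased = samples.var(axis=1, ddof=0)    # divides by n: biased downward
unbiased = samples.var(axis=1, ddof=1)  # divides by n-1: unbiased

mse_biased = np.mean((biased - true_var) ** 2)
mse_unbiased = np.mean((unbiased - true_var) ** 2)
print(f"MSE (biased, /n):       {mse_biased:.4f}")
print(f"MSE (unbiased, /(n-1)): {mse_unbiased:.4f}")
```

For n = 10 the theoretical values are 0.19 for the biased estimator and about 0.222 for the unbiased one, so the biased estimator has lower MSE despite being, well, biased.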

____

For all my Hacker News peeps who want to learn ML and/or DL: you need to
drop everything right now, go print this on the office printer, and sit
outside with coffee for the next two weeks and read through this entire thing.
Turn off the computer and phone. Stop checking HN for two weeks. Trust me,
nothing better than this will come around on HN anytime soon.

[1] book PDF =>
[https://arxiv.org/pdf/1803.08823](https://arxiv.org/pdf/1803.08823)

[2] Jupyter notebooks zip =>
[http://physics.bu.edu/~pankajm/ML-Notebooks/NotebooksforMLReview.zip](http://physics.bu.edu/~pankajm/ML-Notebooks/NotebooksforMLReview.zip)

~~~
rerx
Wow, that must be one of the most enticing endorsements I have ever read. I am
definitely checking this out soon. (Physicist by training, now working with
deep learning)

~~~
Kroniker
There are literally dozens of us!

~~~
stochastic_monk
It fits quite well. In particular, many of our formalisms are used (if
altered) all over machine learning. Training as a physicist gives you a
wide range of analytical skills. I've seen former physicists doing everything
from clinical pediatric cancer research and bioinformatics to chip design,
financial work, and ML.

~~~
sqrt17
I cringed a little at "many of our formalisms". The maths of thermodynamics is
so useful, yet it took, and still takes, substantial effort to pry the useful
maths away from the thermodynamics gobbledygook that surrounds it.

But it's certainly true that Deep Learning with its combination of mathy
underpinnings and poorly understood behaviour is something that fits very well
with a physicist's skillset.

------
pure-awesome
I get the idea that the title "High Bias, Low Variance" is some sort of
reference or pun, but I'm not quite getting it...

I mean, I know that there's a bias-variance tradeoff in stats and ML, but what
does it mean in the context of introduction to ML for physicists?

My guess is that they mean they aren't going into as heavy detail on ML, which
means the reader may lack some knowledge (high bias) but won't miss the
forest for the trees (low variance).

Anyone else care to speculate?

~~~
tanilama
High bias, meaning it is opinionated, written from the author's own
perspective.

Low variance, I think, means the author keeps a sharp, high-quality focus on
the subjects he/she is going to talk about.

~~~
yoquan
I almost agree with what you said. One more piece of context: the first author
wrote about the connection between physics's renormalization group and deep
learning, which was featured in [1] four years ago.

[1]: [https://www.quantamagazine.org/deep-learning-relies-on-renormalization-physicists-find-20141204](https://www.quantamagazine.org/deep-learning-relies-on-renormalization-physicists-find-20141204)

------
shamino
I love it when physicists teach. Concise and pedagogical. Gives us physicists
a good name :)

Michael Hartl (a Caltech physicist) also does a killer job with the Rails
Tutorial. There's definitely a trend.

