To put it in perspective, Michael Jordan is one of the pioneers of the modern approach to statistical machine learning. It's very cool to see his course notes from 12 years ago, when many schools didn't even offer basic machine learning courses.
Learning theory is a math-heavy (proof-heavy) subfield of machine learning that studies what's possible and why some of the methods work as well as they do. Unless you have a strong math background and are already fairly well-versed in machine learning, I'd first take practical classes. Learning theory is the last class I took in grad school and I really enjoyed it.
Earlier this year, I retired (voluntarily) from a large company after 32 years. Now that I have time to breathe/think/sleep, I actually 'audit' many of the online courses I find mentioned here. Great fun. Thanks!
The reason the 2014 class doesn't cover a lot of the material is that a big part of it is based on Prof. Jordan's forthcoming book on Statistical Learning Theory.
Another great study of his: http://www.nowpublishers.com/article/Details/MAL-001