Probabilistic Machine Learning: Advanced Topics (probml.github.io)
183 points by abhi9u 9 months ago | 19 comments



Related:

Probabilistic Machine Learning: Advanced Topics - https://news.ycombinator.com/item?id=30552869 - March 2022 (43 comments)


Both the intro book and this one are great reference books, but I don't find them suitable to study as a main textbook. They cover a large number of topics, so the depth of each topic is pretty limited. Keep this in mind if you are considering studying from them.


I would go even further and say I did not find the intro book to be a good book at all.

The text was full of non-trivial errors that genuinely hindered students' understanding. Moreover, the presentation was not particularly enlightening -- the lengthy mathematical discussions were neither rigorous enough for a proper mathematical introduction nor distilled enough for applied practitioners. I understand that Murphy explicitly tried to strike a balance -- I wonder if this balance ended up in an awkward no man's land.

I do agree that I found the book better as a secondary reference due to its breadth of topics. The second book seems to continue this trend of covering even more topics.


>The text was full of non-trivial errors that genuinely hindered students' understanding

Kudos to the author for putting out a free version and for the work, but the number of errors seems crazy high (I checked a couple, and it doesn't seem like they were fixed in the 2023-06-21 draft PDF he has put on his website). I have the 2022 book, so I definitely have to look into the error list.

https://github.com/probml/pml-book/issues?page=12&q=is%3Aiss...


Ironically, I found it to be too deep. I want a quick feel for the mathematical structure and ergonomics of a field before really diving into 400 pages on logistic regression.


I think it gets to a sufficient depth if you also consider the Supplementary Material and Jupyter Notebooks hosted on GitHub.

But for those with no ML background, the place to start is: https://mml-book.github.io/


What are some good books for people who want to start a master's in ML?


With a similar probability focus, Pattern Recognition and Machine Learning by Christopher Bishop [1] is pretty good. If you are looking into deep learning specifically, I think François Chollet's Deep Learning with Python is one of the most accessible books.

[1] https://www.microsoft.com/en-us/research/uploads/prod/2006/0...


An opinionated list of great resources for learning machine learning: https://nocomplexity.com/documents/fossml/mlcourses.html


I think "Understanding Deep Learning" is very nice - https://udlbook.github.io/udlbook/ (an covers almost all topics, it has maybe just a couple of omissions, such as Multimodal Learning, NERFs and Time Series Prediction)


I find myself wishing for a book that, instead of listing techniques with a few shallow examples per technique, would focus on a meaty problem and then apply different techniques to it iteratively, showing how a practitioner would derive value. Is anyone aware of such books?


That's just a dissertation. If you mean that it should also be written in a pedagogically sound way, I don't think I've ever seen anything like that outside of historically important problems where a single textbook/class might be devoted to the problem (think special relativity). The reason these kinds of things don't get written is the same reason docs for some random package don't get turned into books either: ain't no one got time for that when they're working on solving more such problems (and if you're not so expert that you're busy, then you can't write the book).


Fair enough, thanks.

One of the reasons I think that might be a better way to approach the problem is that it would encourage readers to really dig into a specific problem and get a sense for the behaviors in the data as they develop their understanding of what each technique can add.

Part of the trick would be to get a really good publicly available data set for a problem with enduring significance. Maybe something from an old competition that garnered a lot of interest in its day, like, say, the Netflix recommendation problem.

In a way, such a book would walk you through a lot of the stages of learning that the typical book presumes you would do anyway in your own practice. For me, working on my own, it would answer some of the practical questions that someone working with colleagues would pick up by osmosis.


1200 pages? Do you really read this straight through, or just refer to it? That is overwhelming, to me at least.


Starts off banging through dozens of probability distributions without a pause for breath. Feels like reference material.


The introduction book and a link to the code are at "book1.html".


Yeah. Kevin Murphy just announced on Twitter that this is the final official version of the "Advanced Topics" book.


Judging from the table of contents, he probably could have dropped the "probabilistic" from the title and just called it "machine learning".




