It's not clear to me who the target audience is for this book. As someone with a graduate degree but not a mathematical background, I'm comfortable reading academic papers, but a lot of this feels over my head. So it reads to me like it's a quick refresher or reference for people who already learned this stuff at some point, rather than an introduction. But the preface says this is to be "a book on Mathematics for Machine Learning that motivates people to learn mathematical concepts."
As a sidebar, it has always seemed to me that there is a giant gulf between truly beginner-friendly math books, which are aimed at children, and introductory math books aimed at adults. The latter almost always read like foreign language textbooks where you must first know the language before you can start, while the former are too elementary. I'd love to find "College level Math for English Majors" or something of the like, if anyone knows of such a book :)
OK, I don't know about a book, but 3blue1brown on YouTube has some really cool videos on intermediate maths. His videos are more about understanding the underlying concepts than about formulas.
It looks to me like it's pitched approximately at third-year undergraduates who have a couple of years of a college math background, and is meant to pull them up to a level where they can fluently work with the more advanced concepts used in machine learning.
The route for someone with no (or very little) math background is a lot longer, and I don't think this book is trying to provide it. I think that one either has to 1) bite the bullet and learn a year or two of undergrad math first, which provides the necessary foundation for this stuff, then learn this for real; or 2) be content with understanding and using machine learning at a hand-wavy level (which I am not denigrating). It might be nice to have a "hand-wavy machine learning" book around (there are certainly enough blog posts of that sort), but this isn't trying to be it.
The course requires a 1st-year linear algebra course and a 2nd-year statistics course, which might explain why this book doesn't cover some of the basic concepts.
I put a lot of effort into making them with an "easy ramp-up" so anyone with a basic math background can pick them up. Even if you don't remember your high school math in detail, you can still manage. You can buy print versions on Amazon or ebooks on Gumroad. See the website for links: https://minireference.com/
There are some decent intro books for adults out there if you don’t care about the branch. Honestly, if you’re looking for intro math books you really shouldn’t care too much about the branch. I think what’s more important is to develop mathematical thinking.
Here’s some off the top of my head. I really think these will help you build a good foundation for mathematical thinking.
Strang does a GREAT job explaining the intuition behind linear algebra. The book is targeted at first-year undergrads and an adult with just high-school math could work through it.
Bonus: if you're interested in machine learning, you must learn linear algebra.
More advanced is this Strang masterpiece: Intro to Applied Mathematics.
http://bookstore.siam.org/wc02/
It covers a lot of the applied math that we focused on pre-AI and pre-cloud computing: diff eqs, diffusion equations, Fourier analysis, numerical methods, phase plane analysis, optimization, complex analysis.
I can't recommend Strang enough. I tend to learn better when I can visualize math concepts, and I had such a hard time understanding linear algebra until I finally bought a copy of that book. I love that thing, and it still sits on my bookshelf years later.
I specifically mentioned those because they have little to no prerequisites other than high school math, which I take to mean they can be read by any non-math major. They're very self-contained but offer very little hand-holding. I think just absorbing a little of these books will establish a decent foundation.
Concrete Mathematics is pitched at graduate students in computing. Spivak’s Calculus is an introductory real analysis book pitched at undergraduates who have gone through a computational calculus course already and want to study the subject more formally and rigorously; it has many difficult problems and would generally benefit greatly from the structure and expert feedback of a university course. Jaynes’s book is probably most relevant to science students who are at least at the advanced undergraduate level. How to Solve It is a dictionary of heuristic problem-solving techniques which is most useful to someone who is already (deeply) familiar with mathematical problem solving and wants to codify their existing methods. Even advanced undergraduate math students who read it aren’t going to fully understand the book IMO; I would recommend Pólya’s other books (Mathematics and Plausible Reasoning, Mathematical Discovery), or maybe start with a gentler book like Mason, Burton, & Stacey’s Thinking Mathematically.
Concrete Mathematics is solidly an undergraduate text. Much of the material in it would already be taught well before graduate school. The preface actually states that the book takes some course material taught to graduates and junior/senior undergraduates and presents it for a “wider audience (including sophomores).”
Otherwise I basically agree with your comment. I just take issue with calling Concrete Mathematics a graduate textbook, because I hear people say that as though it’s not an appropriate recommendation for learning. That gives me the impression they’ve not actually opened up a graduate textbook in math or computer science. Concrete Mathematics might not be year one material, but you can do it after a calculus course and maybe an algorithms course. Contrast this with an actual graduate course, like convex analysis and optimization. Textbooks at that level would definitely not be accessible for most undergrads.
Fair enough. It’s still not an easy book for someone to self-study after having no university-level mathematics, just as a side hobby.
I would certainly recommend giving it a shot for anyone interested, as it’s a lovely book full of fun problems. As you say, it’s accessible to well prepared undergraduates.
The target audience would be people that have roughly 2 years of undergrad math, the 4 semester calc sequence or high school equivalent, probability/stats, linear algebra, some computational courses using e.g. Numerical Analysis by Burden/Faires.
If you look in Goodfellow et al's Deep Learning book, Murphy's Machine Learning text, and others mentioned here (Learning from Data, Shalev-Shwartz/Ben-David), the prereqs are always some variation of the above, and I think you could do a lot of it at U.S. community colleges, at least the CCs around me.
Frankly, you'd have to do a bunch of self-study beyond CC, and there's no shortcut/royal road. So the key is self-study; that's a discipline anyone who wants to do data science/machine learning for real needs.
I felt the same. It is a decent enough refresher though!
Machine learning lends itself to easily learning additional pieces of math once you have a nice foundation, and it is nice enough that the foundation is pretty small---vector calc, (mostly) real analysis, linear algebra (it helps if you know infinite but orthogonal eigenfunctions), little bit of physics knowledge (statistical models and hamiltonians), and a little bit of differential geometry.
Unfortunately, that foundation is slowly getting bigger. Read some papers in topological data analysis recently. Category theory is crawling in!
At this point we might benefit a lot from a forum where mathematical concepts are discussed for a particular application in the most explicit way, not just as questions and answers. We could then build on that to extend the mathematical concepts and also to rope more people into the machine learning domain proper. Is this objective a little too unrealistic to achieve?
What does “most explicit ways” mean? Do you mean in the most straightforward language possible, with as much jargon as possible removed? Do you mean extremely patient exposition? Do you mean a first principles approach that builds advanced probability theory, linear algebra and statistics from set theory, then dives into machine learning?
Murphy’s Machine Learning: A Probabilistic Perspective is just over 1000 pages long. It’s an exceptionally good book for the mathematical theory behind machine learning. What would this book look like under your proposal? If we add in Axler’s Linear Algebra Done Right and Chung’s A First Course in Probability Theory, we’re adding another 350 pages or so each. Both of those books have prerequisites of their own, and Murphy’s book isn’t the most advanced text out there, either.
In my opinion your proposal is unrealistic, yes. I don’t mean for that to sound harsh, I just don’t see a way to achieve what you’re asking for without making a monolithic tome larger than Knuth’s The Art of Computer Programming, and potentially even more diverse in scope.
The general way to master a large pile of difficult abstractions is to study for years, ideally in a context focused on that study. This can be done independently of any expert feedback (e.g. by reading books, watching lecture videos, and doing a big pile of ungraded self-assigned homework problems), but it’s not easy or typically very efficient. For the majority of people who want to deeply learn mathematical topics but who are starting from a low level, the most efficient and effective advice for them is to go attend a university.
An online discussion forum isn’t typically going to cut it, IMO. People won’t be at the same level, won’t be working on the same topics, won’t have the same goals and interests, the same level of commitment, coordinated schedules, etc. This has been tried many times, but I have not heard many successful reports.
Online discussion forums are great for handling independent limited-scope questions–answers (including for current university students), and okay for coordinating research efforts among people who are already experts, but in general are not very effective for getting a group of people to diligently work through years of prerequisite material in an organized fashion.
You shouldn’t spend your life in a university, but if you have a goal of learning advanced mathematics, you might benefit from spending a few years there. YMMV.
As an alternative, if you can find an expert tutor/coach to meet with one-on-one on a regular basis, that is even better than a formal course. But that is expensive and hard to scale.
It is also very possible for folks to think they have it right and go off in entirely wrong directions. In my experience this happens often, even with quite good and hard-working students.
I teach an inquiry-based proofs class, so I sit through the students' discussions. Even on material that would be considered trivial in many of the books mentioned here, people say things aloud in those discussions that are simply wrong. It's not that these folks are dummies; it's that this stuff is hard.
I'm not a people person but one thing I've seen from this class is that a lot of learning is about community. It makes a big difference.
> You shouldn’t spend your life in a university, but if you have a goal of learning advanced mathematics, you might benefit from spending a few years there. YMMV.
The problem is, that simply isn't possible / practical for many people. Assuming we're talking about people who are career professionals who are already past their first (and maybe only) stint at a university, most people just don't have the free time (and other resources) to sit in classes at a university. Especially since most universities still cater almost exclusively to "traditional" students who take classes in person, during the day.
If universities made more night classes available, or more online/hybrid classes, I could buy this. But for a lot of us, the best we can do is some combination of "watch YouTube videos, do MOOCs, self-study, do ungraded problems, and ask for help on Internet forums".
That said, not all problems you do on your own have to be strictly "ungraded". For many mathematical subjects you can find books of exercises that include the answers, so you can "grade" yourself (assuming you can resist the temptation to peek at the answers first), or you can find previous years' problem sets and exams on many course websites.
Yeah, learning the stuff outside of a university setting is probably harder in some sense, but it's not impossible. And if that avenue is the only one available to somebody, then it is - by definition - the best avenue available to them.
Most of those people are not going to get all that far learning advanced mathematics, unless they spend serious amounts of time and effort. But sure, I wouldn’t want to discourage them from giving it a shot.
Personally I am not an academic, I never went to grad school, and I do lots of self-studying of various technical topics, up to hopefully some reasonable level of understanding.
This might sound callous, but most of us are not going to master advanced mathematics or theoretical computer science.
If you want to self-study, all the material you could ever hope to need is available in a multitude of books targeted at various levels of prior experience and with many different styles of exposition. If you want to study with others, this is precisely what college is for. People who don’t want to study in a college setting basically don’t want to study in a group setting in general, because the college setting actually is the most efficient group setting.
I think most people underestimate the sheer vastness of the material you have to learn and the requisite time you need to learn it. Reading isn’t enough, you have to do it - over and over. It’s frankly a slog, and there’s no more efficient way than being immersed in coursework.
The entire benefit of a course structure is the instructor. The community forum helps, but really it’s an extension of the instructor. A qualified instructor can quickly resolve a complex question that might take you hours or days to (maybe incorrectly) answer yourself. If you’re capable of learning genuinely unfamiliar math without an instructor, that means you have sufficient mathematical maturity that you don’t need a group forum. At that point it’s actually more efficient to self-study.
I get the sense people believe mathematicians are pretentious when they say things like this, but I think that’s uncharitable. They typically give advice like this because they have a significant amount of personal experience and know very well what tends to work (and how efficiently it works).
For those interested in a similar resource, CMU offers a "Mathematics for Machine Learning" preparatory course each Fall semester. All of the videos from the 2017 edition are available to watch on YouTube.
Not sure if the authors will read this or not, but I beg of you, please put a table of notation in the foreword. The Sutton and Barto Reinforcement Learning book did that for basically every piece of notation that wasn't basic algebra, and it's been extremely helpful.
Just labeling things I had never seen before, like indicator functions, was extremely valuable.
Especially for this kind of book, which is introducing mathematics to people from a broad range of backgrounds, I think it's important to understand how much of an impediment it is to not recognize notation by sight. Trying to Google or search for notation is a nightmare.
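As a quick illustration of why that matters (my own sketch, not something from the book): an indicator function simply turns a true/false condition into a 0/1 value,

    \mathbb{1}\{x \in A\} = \begin{cases} 1 & \text{if } x \in A \\ 0 & \text{otherwise} \end{cases}

so, for example, the average 0-1 classification error over n examples can be written as (1/n) \sum_{i=1}^{n} \mathbb{1}\{\hat{y}_i \neq y_i\}. A one-line notation-table entry saying exactly that would have saved me a lot of searching.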
Yes, this is so useful when it is present. Whenever I read an even moderately mathematical paper I end up making one myself (including the meanings and types of all the variables).
Here is my feedback: this book doesn't appear to be useful and doesn't seem to provide any intuition for why we use any of the linear algebra to do work with machine learning... it seems like an undergraduate book filled with descriptions of math. I find there needs to be a bridge that explains why we use vectors and how we model things, and then relates that back to the math.
Nowadays, many books cover the elementary mathematics in machine learning. Now that I've learnt these elementary topics, are there any good suggestions for computational learning theory?
I recommend Shai Shalev-Shwartz and Shai Ben-David's Understanding Machine Learning: From Theory to Algorithms [0]. I've also used and found Mohri's Foundations of Machine Learning quite insightful [1]. Usually, between the two books, at least one of the proofs for a given result is easy to follow.
Get both unless you're only getting one, in which case, get Shalev-Shwartz's.
Why delete the comment? Useful context for stochastic_monk's reply is missing now. You could always preserve the existing and possibly incorrect comment and append an "Update" or "Edit" section to the comment to override your earlier comment.
Essentially, he asked if the above poster had read Elements of Statistical Learning, Murphy's ML textbook, Bishop's PRML, Reinforcement Learning: An Introduction, and Ian Goodfellow's Deep Learning textbook.
I simply clarified that the question was about computational learning theory, a subfield largely started by Leslie Valiant in the form of PAC (Probably Approximately Correct) learning. The difference in emphasis between the machine learning conferences I mentioned helps point out how practical machine learning (like ICML, matching PRML/ML/ESL) and feature extraction/representation learning (like ICLR, perhaps matching portions of both ICML and ICLR), while important, are not what the previous poster was asking about.
He's specifically asking about learning theory which is a subfield of machine learning, along the lines of work you'll see at COLT, using concentration bounds, VC theory, and Rademacher complexity. PRML, Murphy, ESL, the Deep Learning book, and the RL introduction are more like what you'd see at ICML or ICLR.
I've mentioned this on HN before, but I think it's still relevant to people interested in learning ML who feel they are behind on the math. If, like me, you can't sit through lots of pages of mathematics text and instead prefer that a human explains it to you via videos that you can replay, here is a list of courses that take you from basic algebra and pre-calculus math all the way to the concepts you need to understand the principles behind the most advanced ML algorithms. All explained by very energetic people who are experts in their fields, and starting from the very basics.
This covers calculus, linear algebra, probability, statistics, convex optimization and a math for ML course thrown in for the HN audience:
(The first two are "MOOCs" recorded in the 1970s! Probably the first MOOCs ever recorded, even before the internet, and the lecturer is absolute gold.)
These are great resources, but the ultimate approach is wrong. In order to truly learn math, you must be willing to invest the time and "sit through lots of pages of mathematics text" and also do a ton of problems, many of which will take hours and some which will require days of thinking.
If you aren't willing to invest the energy and effort to do that, all the video watching won't do anything for you. It will get you 5% of the way at best. The true learning comes from staring at a problem for hours, trying 100 dead ends, and then finally having an insight two days later while taking a shower that suddenly makes that intractable problem seem trivial.
> These are great resources, but the ultimate approach is wrong. In order to truly learn math, you must be willing to invest the time and "sit through lots of pages of mathematics text" and also do a ton of problems, many of which will take hours and some which will require days of thinking.
These things aren't mutually exclusive. I don't know about anybody else, but I'd rather first watch a video of a human explaining the subject, and then sit down with a textbook and start working through problems.