
Math Basics for Computer Science and Machine Learning [pdf] - oldgun
http://www.cis.upenn.edu/~jean/math-basics.pdf
======
xiaolingxiao
The professor who wrote this is Jean Gallier, and I had him for advanced
linear algebra at Penn. I am also pretty close to him, insofar as a student
can be close to a professor. On a personal note, he is one of the funniest
professors I've had, and all math professors are characters.

For the people who are interested in ML, the thing to remember here is that he
is a Serious mathematician, and he values rigor and in-depth understanding
above all. A lot of his three-star homework problems were basically
impossible. He writes books first and foremost so _he_ can understand things
better. In math, there's the book you first read when you don't understand
something _, then the book you read once you understand everything. The book
in the link is the latter.

_ for linear algebra, this:
[https://www.amazon.com/Introduction-Linear-Algebra-Gilbert-Strang/dp/0980232775](https://www.amazon.com/Introduction-Linear-Algebra-Gilbert-Strang/dp/0980232775)

~~~
jimbokun
As someone who never took an undergraduate linear algebra course, videos of
Strang's lectures got me through a couple of graduate machine learning courses.

[https://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/video-lectures/](https://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/video-lectures/)

What he does with chalk and a blackboard is far more effective than anything
done today with PowerPoint, fancy computer animations, or what have you.

------
Gene_Parmesan
From the start of Chapter 2:

"In the following four chapters, the basic algebraic structures (groups,
rings, fields, vector spaces) are reviewed, with a major emphasis on vector
spaces. Basic notions of linear algebra such as vector spaces, subspaces,
linear combinations, linear independence, [...], dual spaces, hyperplanes,
transpose of a linear map, are reviewed."

If anyone needs to start even earlier than this, I've actually found "3D Math
Basics for Graphics and Game Development" to be a good true intro for linear
algebra-related stuff. I think this would probably hold even if your primary
interest is something other than graphics/game dev. Some of the text in that
book's intro is a little cringey with its reliance on kind of juvenile game
references, but I didn't find that sort of writing continuing during the
actual text. So just push past that stuff.

I got a copy of it to act as a refresher before diving into Real-Time
Collision Detection since it's been quite a long time since formal math for me
(as in, high school, because I'm self-taught in CS). I've managed to make up a
lot of ground by working hard and finding classes to audit online (Strang's
linear alg course on OCW is a good one), but I have found that depressingly
few math texts which claim to be "introductory" are actually truly
introductory.

This isn't a slight against the linked work; I absolutely love it when profs
make resources such as this freely available.

"How to Prove It" and "Book of Proof" are also great intros to formal math, if
less immediately practical.

~~~
codesushi42
I would disagree about the gamedev book reference, unless you are referring to
the real basics of linear algebra.

The really important concepts for ML are least squares, eigenvalues and
eigenvectors, and SVD. Those concepts are not very relevant to game programming.

Well, least squares can be solved with projection, which is relevant for
converting between coordinate spaces. But game dev isn't going to give you
that intuition.
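The projection point above can be made concrete: a least-squares fit is exactly an orthogonal projection of the observations onto the column space of the design matrix. A minimal NumPy sketch on made-up data (all names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 3))   # design matrix (made-up data)
y = rng.standard_normal(10)        # observations

# Least-squares solution x minimizes ||A x - y||
x, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ x          # the projection of y onto the column space of A
residual = y - y_hat

# Normal equations: the residual is orthogonal to every column of A
print(np.allclose(A.T @ residual, 0))  # True
```

The `allclose` check is the "perpendicular" part: the residual has no component left in the column space.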

~~~
Wurdan
I believe the person you're replying to was attempting to help people like me,
who haven't had a math lesson since leaving high school (in my case over 16
years ago) and whose level of math is roughly "You want me to multiply
something? Let me get my phone."

So while the book in question might not be _the_ best resource, it probably is
a better starting point than the linked doc.

~~~
_russelldb
I wish someone would put together the "basic math" resource for people like
that. I didn't do any maths beyond 16. I was a maths idiot at school (me? the
teacher? I dunno), but since then I've worked 20 years in programming and had
to learn more and more maths just to make a living. However, my maths is a
rickety hodge-podge house with no real foundations. I can't be a total idiot,
because I've done some decent work in distributed systems, even research in
CRDTs (which made me learn some maths), but still, I'd love a self-study plan
that took me from 16 to today with the _real basics_.

Thanks for your comment, it made me feel less alone.

~~~
aashu_dwivedi
I've just finished Coursera's Mathematics for Machine Learning: Linear
Algebra
([https://www.coursera.org/learn/linear-algebra-machine-learning/home/welcome](https://www.coursera.org/learn/linear-algebra-machine-learning/home/welcome))
and I'd say it's very good for beginners.

------
jointpdf
“Math Basics” is quite the misnomer—it gives the impression that one would
need to study all of the contents of this book to be an effective practitioner
in CS or ML. Memorizing every definition and theorem in this book would be
neither necessary nor sufficient for that purpose.

Keep in mind it can take an hour, and sometimes way more, to _really_ absorb a
single page of a math book like this (do the math). This is more of a
reference text.

~~~
duckqlz
> “Math Basics” is quite the misnomer—it gives the impression that one would
> need to study all of the contents of this book to be an effective
> practitioner in CS or ML.

To me, "basics" means that even if you study this entire book you won't fully
understand ML; otherwise it would say "comprehensive". Furthermore, the math
presented in this book is all taught in 1st year courses in most CS programs
I've encountered.

> Keep in mind it can take an hour, and sometimes way more, to really absorb a
> single page of a math book like this.

Learning is a personal experience and happens at differing rates for different
people. While I do agree this book is rather terse and would serve as a good
reference, any added explanations around the proofs would force a split into
multiple publications, so I can see why the authors chose to present it in
the way they did.

Overall I have found this text easy to digest and well formulated, and I
thank the authors and the poster.

~~~
jefft255
Which CS program? I think you’re more than exaggerating. They don’t teach
topology, FEM, and abstract algebra in “most” CS programs. These are just
three random examples; most of the book is wayyy more advanced than what I
would expect to learn in freshman CS.

~~~
TheOtherHobbes
This book seems to be designed as a primer for PhD research students - which
might include some talented final year undergrads heading in that direction.

If that's your level, it might reasonably be called "basic."

For everyone else - no.

------
melodrama
Great book.

I think it's a good time to mention a couple of nice books (related)

1\. An elementary intro to the math of machine learning [0]. Its style is a
bit less austere than the OP's. It also has a chapter on probability. It
could possibly serve as a great prequel to the book linked in the OP.

2\. A book on probability-related topics in general data science:
high-dimensional geometry, random walks, Markov chains, random graphs,
various related algorithms, etc. [1]

3\. Support for people who'd like to read books like the one linked in the OP
but have never seen any kind of higher math before [2]. This book has a cover
that screams "trashy book, extremely skimpy on actual info" (anyone who reads
a lot of tech books knows what I am talking about), but surprisingly, it
contains everything it says it does, and in great detail. Not even actual
math textbooks (say, Springer) are usually written with this much detail. The
author likes to add bullet-point-style elaboration to almost every definition
and theorem, which is (almost) never the case with the gazillions of books
usually titled "Abstract Algebra", "Real Analysis", "Complex Analysis", etc.
Some such books attach words like "friendly" to their title (say, "Friendly
Measure Theory For Idiots") and still do not rise to the occasion. Worse yet,
a ton (if not most) of these books are exact clones of each other with
different author names attached. The linked book doesn't suffer from any of
these problems.

[0] Mathematics For Machine Learning by Deisenroth, Faisal, Ong

[https://mml-book.github.io/book/mml-book.pdf](https://mml-book.github.io/book/mml-book.pdf)

[1] Foundations Of Data Science By Blum, Hopcroft, Kannan

[http://www.cs.cornell.edu/jeh/book%20no%20solutions%20March%202019.pdf](http://www.cs.cornell.edu/jeh/book%20no%20solutions%20March%202019.pdf)

[2] Pure Mathematics for Beginners: A Rigorous Introduction to Logic, Set
Theory, Abstract Algebra, Number Theory, Real Analysis, Topology, Complex
Analysis, and Linear Algebra by Steve Warner

[https://www.amazon.com/Pure-Mathematics-Beginners-Rigorous-Introduction/dp/0999811754](https://www.amazon.com/Pure-Mathematics-Beginners-Rigorous-Introduction/dp/0999811754)

~~~
iamcreasy
I'll check out the last one. Currently I am teaching myself real analysis.

------
SKILNER
Very first sentence of 2.1 is full of notation, symbols and terms that I, as a
prospective student, might not understand.

So many teachers seem incapable of stepping outside their sphere of knowledge
and seeing what they know and others do not. And so much work went into this.

~~~
graycat
Here's what he is doing. He wants to start with the set of real numbers,
intuitively the points on the line, usually denoted by R, maybe typed in some
special font.

Then he wants to define, say, addition of real numbers. So, given two real
numbers, x and y, that might be equal, he wants to define x + y.

So, here he wants to regard addition, that is, +, as an _operation_. Then, as
is usual for defining operations, he wants an _operation_ to be just a special
case of a function. So, he wants to call + a function. So, + will be a
function of two variables, say, x and y. With usual function notation we will
have

+(x,y) = x + y

The set of all (x,y) is the _domain_ of the function, and the set of all x + y
is the _range_.

So, that defines the function + except commonly in pure math we want to be
explicit about the range and domain of the function.

For function +, the domain is just the set of all pairs (x,y) with x and y in
R. That set is also the set theory _Cartesian_ product of set R with itself
and written R x R. So, the domain of + is R x R. The range is just R. Then to
be explicit about the range and domain of function +, we can write

+: R x R --> R

which says that + is a function with domain R x R and range R.
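As a toy illustration of the parent's point, here is + written as an ordinary function of two arguments, a sketch in Python (the names are mine, not the book's):

```python
# "+" is just a function of two arguments with domain R x R and range R:
def plus(x: float, y: float) -> float:
    """The operation +: R x R --> R, i.e. +(x, y) = x + y."""
    return x + y

# The same idea covers other algebraic systems, e.g. addition of 2-vectors:
def vec_plus(u, v):
    return [a + b for a, b in zip(u, v)]

print(plus(2.0, 3.0))            # 5.0
print(vec_plus([1, 2], [3, 4]))  # [4, 6]
```

The payoff of the abstraction is that `plus` and `vec_plus` have the same shape: an operation on a set, which is what groups, fields, and vector spaces generalize.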

We learned how to add in, what, kindergarten? So, why make this so
complicated?

Well, he wants to regard the real numbers as just one example of lots of
different _algebraic systems_ , e.g., groups, fields, vector spaces, and much
more, with lots of operations and, possibly, more that could be defined. E.g.,
later in his book he will want to add vectors and matrices, take an _inner
product_ of two vectors, and multiply two matrices.

So, back to addition on the real numbers, he wants to regard that as just a
special case of an _operation_ on an _algebraic system_.

IMHO there's not much benefit for making adding two real numbers look so
complicated.

Whatever he did in that chapter to define addition on the reals, soon he is
discussing matrix multiplication with no definition at all -- assuming the
reader already understands something that is defined and discussed many pages
later in his book.

So, in his notation

+: R x R --> R

and matrix multiplication, he is using material before he has defined it,
even before he has motivated, explained, exemplified, or indicated the value
of it. In good math writing, and in good technical writing more generally,
that practice is, in non-technical language, a bummer.

But from the table of contents, it appears that the book has quite a long list
of possibly interesting narrow topics. And maybe for the routine material, his
proofs and presentation are good -- maybe. I thought enough of the book to
keep a copy of the PDF. It's there; if someday I want a discussion of some
narrow topic, maybe I'll try his book!

In mathematical writing, it used to be common for the word processing to be
much more work than the mathematics! Now with TeX and LaTeX, and I'm assuming
that the book used one of these two, the flood gates are open!

~~~
mtreis86
Is there a book that explains things in the manner you have? I enjoy math but
have trouble reading it.

~~~
graycat
Most of the best math is not trivial and, thus, usually takes some effort to
understand.

There are some good authors of math for, say, calculus, linear algebra,
differential equations, advanced calculus, advanced calculus mostly for
applications, real analysis, optimization, probability, some topics in
stochastic processes, introductory statistics, various more advanced topics in
statistics.

Mostly the books are short on motivation and applications, and as a result it
is too easy to spend time on material likely not worth the time unless you can
be sure both to live forever and remember forever.

For calculus I liked Johnson and Kiokemeister. I taught from Protter and
Morrey, and it was easier than J&K. Lots of people liked Thomas.

For linear algebra, I liked E. Nering and, then, P. Halmos, _Finite
Dimensional Vector Spaces_ which really is _baby Hilbert space_ theory. Take
Nering seriously -- he was a student of E. Artin at Princeton. His treatment
of linear algebra is balanced and polished. For one of his editions, he has
some group representation theory in the back, good, and some linear
programming, really bad.

A lot of people like the MIT Strang book.

For advanced calculus to help when studying physics, especially electricity
and magnetism and engineering, I very much liked

Tom M. Apostol, 'Mathematical Analysis: A Modern Approach to Advanced
Calculus', Addison-Wesley, Reading, Massachusetts, 1957.

He has more recent versions, but for physics and engineering I like the 1957
version and don't like the later versions at all.

For ordinary differential equations, I liked

Earl A. Coddington, 'An Introduction to Ordinary Differential Equations',
Prentice-Hall, Englewood Cliffs, NJ, 1961.

He makes variation of parameters look really nice -- then you can understand
the remark in the old movie _The Day the Earth Stood Still_. Ordinary
differential equations is a huge, old field, and there is some question about
how much of it deserves study now. Do notice that for _systems_ of ordinary
differential equations, you get to apply some linear algebra in cute ways.

For advanced calculus for applications, there is the old MIT Hildebrand -- he
knows what he is talking about, is easy enough to read, and a good place to
go if you need one of his topics.

In recent decades, the pure math departments wanted to teach _advanced
calculus_ as the theorems and proofs for freshman calculus. So there is
Rudin, _Principles of Mathematical Analysis_ , third edition (not the first
two, maybe a later edition if there is one).

Here's what is going on: He wants to develop the Riemann integral, which is
the one in freshman calculus. For that he wants to integrate over a closed
interval on the real line, that is, some [a,b], which for real numbers a <= b
is the set of all real numbers x so that a <= x <= b. Rudin will argue that
[a,b] is a _compact_ subset of the reals and show that the Riemann integral
of a continuous function exists on compact sets. So, the first chapters are
big on compact sets. Then he talks about functions that are continuous and
then ones that are uniformly continuous. With uniform continuity, the Riemann
integral follows right away.

Later he does some infinite sequences and series and then uses these for
careful treatments of some important results: the exponential function, the
number e, the sine and cosine, etc. Later he does integration of functions of
several variables on manifolds for Stokes' theorem and the related divergence
theorem, a fully careful treatment of these theorems used in E&M. He does the
Cartan _exterior algebra_ : What is going on is that he wants to integrate a
function g: M --> R where M is a _manifold_ , that is, the range of some
function f from some box, triangle, etc. to the space containing M. So, for
this you need the formula for change of variable when integrating with
several variables, and that is a determinant of a square matrix. This
integration is a multidimensional version of the line integral, where the
direction of integration is important -- the exterior algebra is the
multi-dimensional version of that. You can see that again in some treatments
of general relativity in physics.

I like Rudin's third edition: Once you know what the heck he is driving at
and how he is getting there, say, as above, then his high precision is
welcome.

For statistics, I suggest using some popular elementary book as a start. Then
learn probability really well and from then on study particular topics in
statistics as needed. The current directions in machine learning promise to
make lots of particular topics important.

For a first book on statistics, consider

George W. Snedecor and William G. Cochran, 'Statistical Methods, Sixth
Edition', ISBN 0-8138-1560-6, The Iowa State University Press, Ames, Iowa,
1971.

My wife did really well with that. So, get a good start on statistics and,
then, get to learn some analysis of variance (experimental design), an
underrated topic.

For a second book on statistics, consider

Alexander M. Mood, Franklin A. Graybill, and Duane C. Boes, 'Introduction to
the Theory of Statistics, Third Edition', McGraw-Hill, New York, 1974.

Here, go quickly and get only the high points and don't expect the math to be
very good -- in places it's pretty bad.

For regression analysis and linear multivariate statistics more generally,
there are several books: Maurice M. Tatsuoka; Donald F. Morrison; William
W. Cooley and Paul R. Lohnes; N. R. Draper and H. Smith. So, in particular,
learn enough to understand that regression is a perpendicular projection and,
thus, gives the Pythagorean theorem again.
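That projection point can be checked numerically: because the fitted values are a perpendicular projection, the Pythagorean identity ||y||^2 = ||y_hat||^2 + ||e||^2 holds exactly. A small NumPy sketch on made-up data (names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((20, 4))   # made-up design matrix
y = rng.standard_normal(20)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta       # fitted values: perpendicular projection of y onto col(X)
e = y - y_hat          # residual, orthogonal to the fitted values

# Pythagoras: ||y||^2 = ||y_hat||^2 + ||e||^2
print(np.allclose(y @ y, y_hat @ y_hat + e @ e))  # True
```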

For more on such statistics aimed at machine learning, get the Breiman CART --
_Classification and Regression Trees_ , maybe much of the start of ML.

With that much in statistics, you will have seen a lot of applied probability
and may be ready for the real stuff. For that, you need measure theory, e.g.,
the first half of Rudin, _Real and Complex Analysis_ , or Royden, _Real
Analysis_. Then read Breiman, _Probability_. After that you might read some
of Chung, Loeve, Neveu, and maybe some more. Then return to applications,
including statistics, with a really solid foundation in probability, random
variables, the classic limit results, and much more. Then you can read and/or
write on lots of advanced topics in statistics.

For optimization, a similar review is possible, but it's getting late.

~~~
mtreis86
Thank you very much for the list. If you care for any further reviews I would
greatly appreciate it. Your list as is will take me years anyways.

------
merlinsbrain
I love that they have problems you can solve as well at the end of (almost)
every chapter.

This IS a lot of math (1,962 pages) and it's missing a preface/introduction,
which would have been helpful for understanding whether I need to go linear
or if a la carte is okay. At the moment I'd assume each major section is
independent.

Awesome find! Wonder how it’s used. (One of) the author(s) seems pretty
prolific too -
[http://www.cis.upenn.edu/~jean/](http://www.cis.upenn.edu/~jean/)

~~~
Koshkin
> _a la carte_

Yeah, I wish we had an online resource (other than Wikipedia) anyone could
learn any sort of math from in a systematic way... Oh well.

~~~
MikeTheGreat
I'm wondering - is this a genuine request, or a snarky, implicit reference to
an online resource for learning math somewhere?

I'd love to know about the existing resource, if it exists. (The only thing
that comes to mind is Wolfram Alpha, which didn't seem 'systematic' the last
time I skimmed the main page)

~~~
house9-2
Maybe they were referring to Khan Academy?

[https://www.khanacademy.org/math](https://www.khanacademy.org/math)

~~~
pirocks
I think a Khan Academy for higher math/physics/chemistry/CS would be
invaluable. Maybe it could be crowdsourced in some way?

~~~
kebman
Well, there's always 3Blue1Brown:
[https://www.youtube.com/channel/UCYO_jab_esuFRV4b17AJtAw](https://www.youtube.com/channel/UCYO_jab_esuFRV4b17AJtAw)
Take a look at his course, Essence of Calculus:
[https://www.youtube.com/watch?v=WUvTyaaNkzM&list=PLZHQObOWTQ...](https://www.youtube.com/watch?v=WUvTyaaNkzM&list=PLZHQObOWTQDMsr9K-rj53DwVRMYO3t5Yr)
I'm not fluent in math, but I find it fascinating to watch, for some reason.
NancyPi is also great:
[https://www.youtube.com/channel/UCRGXV1QlxZ8aucmE45tRx8w](https://www.youtube.com/channel/UCRGXV1QlxZ8aucmE45tRx8w)
Oh, and don't forget the Mathologer:
[https://www.youtube.com/channel/UC1_uAIS3r8Vu6JjXWvastJg](https://www.youtube.com/channel/UC1_uAIS3r8Vu6JjXWvastJg)
What a great guy! While I'm at it, go watch Numberphile too:
[https://www.youtube.com/user/numberphile](https://www.youtube.com/user/numberphile)
Is there anyone I forgot? :D

------
meuk
Whoah, this covers a lot. I was expecting some linear algebra, calculus, and
discrete math, but there's actually some stuff in there I don't know after
doing a master's in math.

~~~
djaychela
That makes me feel somewhat better - I saw the title of 'Maths Basics' and
thought 'Great!'... then I saw it's 1,962 pages - if that's the basics, how
much is the intermediate and advanced bit?

------
abhisuri97
I love Professor Gallier! He's an incredible person.

That being said, this is faaaaaar beyond basics. It'd be more appropriate to
call this an incomplete (aiming to be comprehensive) guide to almost
everything you need to know in computer science (related to math).

~~~
graycat
From my look at the table of contents, the book seems short on both
probability and statistics.

------
markus_zhang
That's almost 2,000 pages of math... I don't know why or how, but somehow I
forgot most of the statistics knowledge I obtained as a graduate student (in
Stat) 10 years ago.

I remember that I took an advanced course on Bayesian inference and one
course on multivariate statistics (PCA, factor analysis, these kinds of
things), and my project was about Bernstein polynomials. That's it...

~~~
xenihn
You forget complex things that you don't use regularly. Math, spoken
languages, written languages, coding...of course you can re-learn it, and re-
learning is faster than learning it for the first time.

Based on speaking to my managers in the past, it seems like a year-long lapse
is enough for you to lose an incredible amount of retained knowledge/skill.
But it's not a permanent loss.

~~~
markus_zhang
Yeah, agreed. Sometimes reading a research paper from the DS team will
actually ring a bell somewhere and I know where to look. I'm re-learning
statistics from the bottom up at the moment, lol, but this 2,000-page book
really looks daunting. I'm pretty sure I didn't take any advanced
optimization course back in university.

------
emmanueloga_
In basic calculus one can burn countless hours memorizing mechanical rules to
differentiate and integrate different function forms, or one can just plug
the function into something like Wolfram Alpha and get, for a lot of useful
cases, a symbolic answer, or at least some approximate answer for a point or
interval.

The point is, understanding integrals and derivatives doesn't require one to
memorize all the mechanical rules. Using software to compute those functions
can be a huge time saver. No one should go with pen and paper, double-checking
whether that polynomial integral is correct or not!

With a book almost 2,000 pages long, I wonder if this book leans more heavily
on the mechanical-rules side of math. In my mind, it's the difference between
writing a book such that you can _write your own Wolfram Alpha_ , or writing
a book so you can just _use it_.
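For what it's worth, that symbolic-answer workflow is also available offline, e.g. via SymPy rather than Wolfram Alpha. A minimal sketch (the polynomial is just an arbitrary example):

```python
import sympy as sp

x = sp.symbols("x")
f = x**3 - 2*x + 1

# The mechanical rules, done by software:
print(sp.diff(f, x))       # 3*x**2 - 2
print(sp.integrate(f, x))  # x**4/4 - x**2 + x
```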

~~~
MAXPOOL
You don't need to memorize rules when studying math. Just like you don't need
to spend any time to memorize syntax for programming languages. You
automatically remember things you use a lot.

Once you have spent countless hours doing exercises to the extent that you
understand the math, you already remember the rules. If you have not spent
countless hours doing exercises, you don't understand anything at this level.

You don't hire a programmer who has read all the books and 'understands'
programming but has never programmed. It's the same with math. You don't just
read a math book from start to finish. You can use wolfram alpha for
visualizing functions, not for learning math.

~~~
plinkplonk
+100. I can't upvote this enough.

Programmers have spent countless hours practising programming, to the point
where they have forgotten how difficult it was in the beginning. A
non-programmer might think of programming as "memorizing hundreds of rules"
to get anything done, but one doesn't learn programming by sitting around
explicitly memorizing hundreds of rules and _then_ beginning to program.

Actually writing programs with a minimal set of 'rules' memorized and then
adding more as needed is how one typically learns programming.

~~~
kebman
I've been teaching programming for five or six years now. I always start with
HTML, then add CSS, and then add JavaScript. That way they experience mastery
all the way, and see how they can be creative with the code. It's so great to
see a pupil "get it" -- and sometimes it's even pupils that "suck at math" or
who have problems spelling the most basic sentence correctly. In fact, I've
found that there's a strange correlation where pupils who have dyslexia often
seem to be better than others at programming.

~~~
commandlinefan
> I always start with HTML, then add CSS, and then add JavaScript

So when do you get around to teaching programming, then? ; P

/ducks

~~~
Koshkin
Starting with HTML and JavaScript is one of the most efficient ways to get
children interested in programming.

------
krosaen
Haha 1900 pages on "basics"?

I suspect there are better resources for each topic covered (e.g. Gilbert
Strang's books and OCW lectures for linear algebra), but it is definitely
interesting to peruse and get a sense of relevant topics.

------
jvehent
"Math Basics"

It's 2000 pages long....

~~~
kouh
Chapter 1 can't be more basic -- a great summary of what will remain
intuitive after reading the whole opus.

~~~
euph0ria
lol

------
j7ake
Even if you were ambitious and managed to read 5 pages per day, every day, it
would take you more than a year to read this from start to finish.

------
TimMurnaghan
Nice to see wavelets in here -- but it's a shame that he seems to be
encouraging people to actually use Haar wavelets. They're fine for teaching,
but there are usually better choices in real life. Daubechies wavelets are a
good default.

------
floki999
The writing style of this book, i.e., rigorous math notation and
proposition/proof presentation, is going to put off the great majority of
potential CS and ML readers. At almost 2,000 pages it sure makes a great
door-stop though.

------
impaktdevices
Me: Oh, good! I've always been pretty good at math but I want to learn how to
make sense of the math I encounter in CS and ML.

[Reads the first paragraph of the 2nd chapter]

Me: I don't know anything about math. At all.

------
amthewiz
Basics should be concepts that get you to 80% and tell you where to look for
the remaining 20%. This book tries to get you directly to 95% and is best
treated as a reference book.

------
laichzeit0
Strangely, probability theory is completely omitted.

~~~
kebman
Is there a probable cause?

------
ps101
I really can't figure out who the target audience for this book is, if it has
a target audience at all.

------
decotz
403 forbidden. Can someone host this?

~~~
ccffph
[https://web.archive.org/web/20190730230113/https://www.cis.upenn.edu/~jean/math-basics.pdf](https://web.archive.org/web/20190730230113/https://www.cis.upenn.edu/~jean/math-basics.pdf)

------
jaimex2
I look at this and profoundly thank the people who make ML libraries for the
rest of us.

~~~
vecter
The vast vast vast vast vast majority of this book (which is more of a
reference and encyclopedia than an actual book for learning) is not required
for implementing most ML libraries.

------
strikelaserclaw
This is more like the courses a talented undergrad math major would take over
4 years.

------
estomagordo
Okay, so this looks potentially awesome. But given that it is a reference work
rather than some introductory "basic" little quick read-through, I'd prefer to
have it in paper form.

Any hope of that happening?

------
currymj
this is an incredible reference for a machine learning researcher who wants to
fill in some gaps in their existing mathematical knowledge.

But I would be shocked if this would be of any use for someone trying to learn
a little linear algebra in order to play with neural networks. For that I
think you still want Strang.

I think "foundations" might have been a better word than "basics" here.
"Basics" in any case is not in the printed title, only in the filename.

------
parasdahal
403 forbidden, can someone help us out?

~~~
itchyjunk
It now redirects to google drive. Simply refresh. It redirects to [0].

[0]
[https://drive.google.com/file/d/1sJvLQwxMyu89t2z4Zf9tD7O7efn...](https://drive.google.com/file/d/1sJvLQwxMyu89t2z4Zf9tD7O7efnbIUyB/view)

------
iserlohnmage
Can someone recommend me a book on Linear Algebra, Statistics and Probability?

~~~
neighbour
Second this. If it has exercises, even better. About to undertake my MSc Comp
Sci and would like to learn Discrete Math as I'm coming from a non-math
background (did B.IT at school and only took two math courses).

------
bigred100
That’s... quite a lot of math

------
tempodox
This book looks great, I've been looking for a resource like that.

------
mjortberg521
Great to see Prof. Gallier featured on here!

------
sgt101
Math "Basics" in nearly 2k pages!

------
ForFreedom
Q: Is this all necessary for ML?

~~~
ovi256
No, you can be an ML practitioner with just an intuitive understanding of
how, say, gradient descent works, and you would do fine. You can even pick up
that intuitive understanding on a strictly need-to-know basis, when it's
needed for learning an ML technique. That's what fast.ai teaches.

For being more than a practitioner, like an implementer of new ML libraries
or a researcher, of course you'd need to know more.
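For anyone curious what that intuitive understanding amounts to, gradient descent is only a few lines: repeatedly step against the gradient. A toy sketch on f(w) = (w - 3)^2, whose minimum is at w = 3 (not from any particular library; all names are illustrative):

```python
def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Minimize a 1-D function by stepping against its gradient."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# f(w) = (w - 3)^2 has gradient f'(w) = 2 * (w - 3)
w_star = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
print(round(w_star, 4))  # 3.0
```

Real ML swaps in a loss over data and a vector of weights, but the loop is the same shape.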

------
gantkimthis
Is linear algebra something you need to work with machine learning?

------
planetabhi
This is a good book

------
ppcdeveloper
This is nice.

------
manca
Wow!

