I used 18.06 to learn the linear algebra I needed before further study in machine learning. It was sufficient for that purpose (I got a B and an A in CMU's 10-601 and 10-605 ML courses, respectively), but 18.065 is geared more specifically toward ML. The companion book is quite good as well.
OCW link: https://ocw.mit.edu/courses/mathematics/18-065-matrix-method...
I've been reading (along with my kids) Ben Orlin's Math with Bad Drawings, which is great at motivating and presenting math in interesting ways: it frames everything as an interesting puzzle to be solved, rather than the rote "this is the algorithm / solution" way much of math is presented.
Alan (co-creator of Julia) mentions that, according to Google, only Swift and Julia make the cut for ML.
I do like Swift and I’m willing to learn Julia, but most of the ML I see uses Python. Is there any traction for Julia or Swift, or is it mostly aspirational at the moment?
Google has thousands of developers writing Python for ML every day. TF is in Python, and remains in Python.
Here’s the evaluation we are discussing:
Don't mistake one person's opinion for the evaluation of an entire company. In general, large companies, and especially Google, have parallel teams solving the same problems in different ways, and many new approaches pop up over time. The vast majority either die or find a small niche somewhere.
It's understandable that the creator of Swift will think his language is the best for X. It's also understandable that one of the creators of Julia will take any opportunity to toot their own horn. That's how the industry goes. None of this should be taken as a formal endorsement by a large company, simply because such endorsements are exceedingly rare. Consider that Google has created several programming languages by now (including popular ones like Go), yet it still has thousands of coders writing millions upon millions of lines of new Java code every year.
"It is very early days for Swift for TensorFlow. We definitely don’t recommend anyone tries to switch all their deep learning projects over to Swift just yet! Right now, most things don’t work. Most plans haven’t even been started. For many, this is a good reason to skip the project entirely."
I don't use either, and to be honest I was surprised that their original set of languages evaluated was C++, Rust, Julia, and Swift. That seems like an odd list of languages to start with when discussing ML.
I found these slides that look like Alan’s from a different talk. The first slide was his point:
I believe this is the "study" the presenter is summarizing, which further makes it clear that Julia did not in fact "make the cut". Put another way, if we are going to argue that Google said Julia makes the cut, we could just as easily make the argument that Rust and C++ made the cut.
 https://github.com/tensorflow/swift/blob/master/docs/WhySwif...
Because Google is fairly well known for being a “decide as late as possible, often after multiple competing options from different teams are in place and being used in real, significant applications” organization.
... written by Chris Lattner when he joined Google Brain and found Python lacking as a language for doing ML work. He evaluated a bunch of alternatives and decided that Swift would be a good choice, so he's now working with a team to explore that.
As far as Google at large is concerned, there was no cut of anything. Google keeps producing new ML code in Python, and new ML frameworks in Python, on a daily basis.
In my opinion, he takes great care to motivate each topic and express his train of thought when working through problems.