
Gilbert Strang's New Course on Linear Algebra+ for ML Now Online - xamdam
OCW link:
https://ocw.mit.edu/courses/mathematics/18-065-matrix-methods-in-data-analysis-signal-processing-and-machine-learning-spring-2018/video-lectures/

YouTube playlist:

https://www.youtube.com/playlist?list=PLUl4u3cNGP63oMNUHXqIUcrkS2PivhN3k

Book link:

https://amzn.to/2WecEkk
======
konz
Previous discussion:
[https://news.ycombinator.com/item?id=19933274](https://news.ycombinator.com/item?id=19933274)

~~~
xamdam
Yeah, just noticed. I posted this (even earlier) on the ML subreddit, but it didn't get as
much interest.

------
dsiegel2275
Strang is an excellent lecturer - his videos for 18.06 (Linear Algebra) were
instrumental for me in relearning the subject matter more than twenty years
after I last studied it.

I used 18.06 to learn the linear algebra necessary to prepare for further
study in machine learning. It was sufficient for that purpose (I got a B and
an A in CMU's 10-601 and 10-605 ML courses, respectively), but this 18.065
course is geared more specifically toward it. The companion book is quite
good as well.

~~~
liopleurodon
I second this. His videos are phenomenal.

------
xamdam
Uh-oh, no linkifying in text submissions, so here we go:

OCW link:

[https://ocw.mit.edu/courses/mathematics/18-065-matrix-method...](https://ocw.mit.edu/courses/mathematics/18-065-matrix-methods-in-data-analysis-signal-processing-and-machine-learning-spring-2018/video-lectures/)

YouTube playlist:

[https://www.youtube.com/playlist?list=PLUl4u3cNGP63oMNUHXqIU...](https://www.youtube.com/playlist?list=PLUl4u3cNGP63oMNUHXqIUcrkS2PivhN3k)

Book link:

[https://amzn.to/2WecEkk](https://amzn.to/2WecEkk)

------
netwanderer3
I love how the introductory video gives you a big picture of the different ways
the math can be applied to real-life scenarios. I wish schools did this more
often. Often students don't understand at the outset how any of this material
could be useful, so they don't invest the effort or aren't interested at all.
Every curriculum should open with an introductory course that clearly outlines
how the content of each course applies to real-life scenarios, including
examples of use cases that help students see the big picture and motivate them.

~~~
localhost
+1. Motivation is perhaps the most important aspect of teaching. I find that
lectures are a terrible medium for teaching a concept; I need time for quiet
contemplation / practice / experimentation to truly understand something.
However, lectures are fantastic for motivation and for encouraging me to do
the extra work on my own.

I've been reading (along with my kids) Ben Orlin's book Math with Bad Drawings,
which is great at motivating math: it presents things as interesting puzzles to
be solved, rather than in the rote "this is the algorithm / solution" way much
of math is usually taught.

[https://books.google.com/books/about/Math_with_Bad_Drawings....](https://books.google.com/books/about/Math_with_Bad_Drawings.html?id=2G9BDwAAQBAJ&printsec=frontcover&source=kp_read_button#v=onepage&q&f=false)

------
melling
I skipped to the end and watched Alan Edelman’s talk on Julia.

[https://youtu.be/rZS2LGiurKY](https://youtu.be/rZS2LGiurKY)

Alan (co-author of Julia) mentions that only Swift and Julia make the cut for
ML, according to Google.

I do like Swift and I’m willing to learn Julia, but most of the ML I see uses
Python. Is there any traction for Julia or Swift, or is it mostly aspirational
at the moment?

~~~
anonymousJim12
I just watched the video and the presenter makes it clear that Google picked
Swift ultimately and eliminated Julia from evaluation in previous knockout
rounds.

I don't use either and to be honest was surprised that their original set of
languages evaluated was C++, Rust, Julia and Swift. Seems like an odd list of
languages to start with when discussing ML.

~~~
melling
You are saying something completely different. Google ultimately picked one
language to support. That doesn’t disqualify Alan’s point about Julia.

I found these slides, which look like Alan's from a different talk. The first
slide makes his point:

[https://arpa-e.energy.gov/sites/default/files/2a%20-%20Edelm...](https://arpa-e.energy.gov/sites/default/files/2a%20-%20Edelman%2C%20Alan%20Presentation.pdf)

~~~
anonymousJim12
I am saying something completely different. The presenter in your YouTube link
makes it clear that Julia made the cut for evaluation but that ultimately
Swift was the only language left standing. The slides make this less clear as
they stop before the final knockout round.

I believe this[0] is the "study" the presenter is summarizing, which further
makes it clear that Julia did not in fact "make the cut". Put another way, if
we are going to argue that Google said Julia makes the cut, we could just as
easily make the argument that Rust and C++ made the cut.

[0]
[https://github.com/tensorflow/swift/blob/master/docs/WhySwif...](https://github.com/tensorflow/swift/blob/master/docs/WhySwiftForTensorFlow.md)

------
SantalBlush
Love Gilbert Strang's original linear algebra lectures on YouTube [1]. They
were a huge help while I took the course.

In my opinion, he takes great care to motivate each topic and express his
train of thought when working through problems.

[1]
[https://m.youtube.com/watch?v=ZK3O402wf1c](https://m.youtube.com/watch?v=ZK3O402wf1c)

