[dupe] Gilbert Strang's New Course on Linear Algebra+ for ML Now Online
161 points by xamdam 30 days ago | 23 comments
OCW link: https://ocw.mit.edu/courses/mathematics/18-065-matrix-methods-in-data-analysis-signal-processing-and-machine-learning-spring-2018/video-lectures/

YouTube playlist:

https://www.youtube.com/playlist?list=PLUl4u3cNGP63oMNUHXqIUcrkS2PivhN3k

Book link:

https://amzn.to/2WecEkk

Yeah, just noticed. I posted this (even earlier) on Reddit ML, but it didn't get as much interest.


Strang is an excellent lecturer - his videos for 18.06 (Linear Algebra) were instrumental for me in relearning the subject matter more than twenty years after I last studied it.

I used 18.06 to learn the linear algebra necessary to prepare for further study in machine learning. It was sufficient for that purpose (I got a B and an A in CMU's 10-601 and 10-605 ML courses, respectively), but this 18.065 course is geared more specifically toward ML. The companion book is quite good as well.


I second this. His videos are phenomenal.



I love how the introductory video gives you a big picture of the different ways the math can be applied to real-life scenarios. I wish schools did this more often. Students often don't understand at the outset how all this stuff could be useful, so they don't invest the effort or aren't interested at all. Every curriculum should have an introductory course that clearly outlines how the content of each course applies to real-life scenarios, with example use cases that help students see the big picture and motivate them.


+1. Motivation is perhaps the most important aspect of teaching. I find lectures a terrible medium for teaching a concept; I need time for quiet contemplation / practice / experimentation to truly understand something. However, lectures are fantastic for motivation and for encouraging me to do the extra work on my own.

I've been reading (along with my kids) Ben Orlin's Math with Bad Drawings, which is great at motivating math: it presents things as interesting puzzles to be solved rather than in the rote "this is the algorithm / solution" way much of math is taught.

https://books.google.com/books/about/Math_with_Bad_Drawings....


I skipped to the end and watched Alan Edelman’s talk on Julia.

https://youtu.be/rZS2LGiurKY

Alan (co-author of Julia) mentions that only Swift and Julia make the cut for ML, according to Google.

I do like Swift and I’m willing to learn Julia, but most of the ML I see uses Python. Is there any traction for Julia or Swift, or is it mostly aspirational at the moment?


Just to be clear, the "according to Google" part is completely off.

Google has thousands of developers writing Python for ML every day. TF is in Python, and remains in Python.

Chris Lattner (the creator of Swift at Apple) joined the TensorFlow team and is working on futuristic approaches to compiling ML programs. He (personally) evaluated a bunch of languages for this next-gen work and decided Swift is the right one. So he has a small team of folks working on that project, exploring how Swift can be used. That's very far from any official decision by Google on anything. Google has people writing ML in Java, C++, Python, Haskell, Lua and Javascript as well.


Alan’s words, not mine.

Here’s the evaluation we are discussing:

https://github.com/tensorflow/swift/blob/master/docs/WhySwif...

HN comments:

https://news.ycombinator.com/item?id=19129641


Yes, that evaluation was committed by Chris Lattner -- see https://github.com/tensorflow/swift/commit/028f245ef3ca735b1...

Don't mistake one person's opinion for the evaluation of an entire company. In general, large companies, and especially Google, have parallel teams solving the same problems in different ways, and many new approaches pop up over time. The vast majority either die or find a small niche somewhere.

It's understandable that the author of Swift will think his language is the best for X. It's also understandable that one of the creators of Julia will find any opportunity to toot their own horn. That's how the industry goes. None of this should be taken as a formal endorsement by a large company, simply because such endorsements are exceedingly rare. Consider that Google has created several programming languages by now (including popular ones like Go), but it still has thousands of coders writing millions upon millions of LOC of new Java code every year.


Fast.ai is considering switching to Swift for future courses[1]. Could we be seeing the first cracks in Python's dominance of the ML space?

1) https://www.fast.ai/2019/03/06/fastai-swift/


"The combination of Python, PyTorch, and fastai is working really well for us, and for our community. We have many ongoing projects using fastai for PyTorch, including a forthcoming new book, many new software features, and the majority of the content in the upcoming courses. This stack will remain the main focus of our teaching and development.

It is very early days for Swift for TensorFlow. We definitely don’t recommend anyone tries to switch all their deep learning projects over to Swift just yet! Right now, most things don’t work. Most plans haven’t even been started. For many, this is a good reason to skip the project entirely."


I just watched the video, and the presenter makes it clear that Google ultimately picked Swift, having eliminated Julia in earlier knockout rounds of the evaluation.

I don't use either and to be honest was surprised that their original set of languages evaluated was C++, Rust, Julia and Swift. Seems like an odd list of languages to start with when discussing ML.


You are saying something completely different. Google ultimately picked one language to support. That doesn’t disqualify Alan’s point about Julia.

I found these slides, which look like Alan's from a different talk. The first slide makes his point:

https://arpa-e.energy.gov/sites/default/files/2a%20-%20Edelm...


I am saying something completely different. The presenter in your YouTube link makes it clear that Julia made the cut for evaluation but that ultimately Swift was the only language left standing. The slides make this less clear as they stop before the final knockout round.

I believe this[0] is the "study" the presenter is summarizing, which further makes it clear that Julia did not in fact "make the cut". Put another way, if we are going to argue that Google said Julia makes the cut, we could just as easily make the argument that Rust and C++ made the cut.

[0] https://github.com/tensorflow/swift/blob/master/docs/WhySwif...


Is it “Google”, or a single permission-to-fail R&D team at Google, headed by the creator of Swift, that focused on languages with features like those of Swift and ultimately settled on Swift?

Because Google is fairly well known for being a “decide as late as possible, often after multiple competing options from different teams are in place and being used in real, significant applications” organization.


It's certainly the latter. There's a small research team at Google evaluating whether Swift is a good candidate to replace Python as the de-facto language for ML. That's all there is to it.


Can anyone comment on how well developed CUDA support is in Julia, compared to Python? I'm using Python/Numba for my scripts.
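
For context, this is roughly the kind of script I mean. A minimal Numba CUDA sketch (the kernel and sizes are made-up examples; it assumes a CUDA-capable GPU and a working numba install):

    import numpy as np
    from numba import cuda

    @cuda.jit
    def axpy(a, x, y, out):
        # one thread per element; guard against running past the array
        i = cuda.grid(1)
        if i < out.size:
            out[i] = a * x[i] + y[i]

    n = 1 << 20
    x = np.random.rand(n).astype(np.float32)
    y = np.random.rand(n).astype(np.float32)
    out = np.zeros_like(x)

    threads_per_block = 256
    blocks = (n + threads_per_block - 1) // threads_per_block
    # Numba copies the NumPy arrays to and from the device around the launch
    axpy[blocks, threads_per_block](np.float32(2.0), x, y, out)

Curious what the idiomatic Julia equivalent of this looks like, and how stable it is.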


From my experience (and this seems to be a general trend with Julia projects), CUDA support in Julia:

- is not completely stable
- is very fast moving
- has a low learning curve
- is highly amenable to your own tweaks and hacks


Curious what your source for "make the cut" is. I have my own interpretation, but would like to see the original.


As written above, the cut is this evaluation -- https://github.com/tensorflow/swift/commit/028f245ef3ca735b1...

... written by Chris Lattner when he joined Google Brain and found Python lacking as a language for doing ML work. He evaluated a bunch of alternatives and decided that Swift would be a good choice, so he's now working with a team to explore that.

As far as Google at large is concerned, there was no cut of anything. Google keeps producing new ML code and new ML frameworks in Python on a daily basis.


Love Gilbert Strang's original linear algebra lectures on YouTube [1]. They were a huge help while I took the course.

In my opinion, he takes great care to motivate each topic and express his train of thought when working through problems.

[1] https://m.youtube.com/watch?v=ZK3O402wf1c



