
Computational Linear Algebra - julianj
http://www.fast.ai/2017/07/17/num-lin-alg/
======
thearn4
Looks like a reasonable overview of dense linear algebra operations, with
very specific applications in mind.

I feel like iterative Krylov subspace methods should show up somewhere, but
to be honest I'm not sure what the application space for linear inverse
problems looks like in the deep learning domain. So maybe that wouldn't quite
fit (despite me finding them pretty cool, and these methods sort of eating
the lunch of most other inverse methods).

I'd maybe round out the course with a read-through of Trefethen & Bau,
“Numerical Linear Algebra”, for those new to the topic area.

~~~
chris_st
So, what applications are iterative Krylov subspace methods good for? Can you
give us a sense of where they're useful, and what (in general) inverse methods
in this space are?

~~~
thearn4
The sibling comments are great, so I'll comment in a more direct,
application-oriented way: Krylov methods expose most of the machinery of
linear algebra to you by means of only forward matrix multiplication with
the relevant matrix.

A cool thing is that this means you can do linear algebra operations with a
matrix (including solving linear systems) without ever needing to explicitly
construct said matrix. We just need a recipe describing the action of
multiplying by the matrix in order to do more advanced things with it. This
is why they are so popular for sparse matrices, but their suitability goes
beyond just that.

Projecting problems into Krylov spaces also tends to have a noticeable
regularizing effect.
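
For a concrete sketch of the matrix-free idea: SciPy exposes this directly
via LinearOperator, where you supply only the matvec and hand it to a Krylov
solver like conjugate gradient. A toy example (mine, not from the course):

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, cg

    n = 1000

    def matvec(x):
        # Action of an SPD tridiagonal operator (2 on the diagonal,
        # -1 off the diagonal) without ever forming the n-by-n matrix.
        y = 2.0 * x
        y[:-1] -= x[1:]
        y[1:] -= x[:-1]
        return y

    A = LinearOperator((n, n), matvec=matvec)
    b = np.ones(n)
    x, info = cg(A, b)  # conjugate gradient, a Krylov method
    print(info)         # 0 means it converged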

------
rabreu08
Looks like a good course. I think it would benefit from a module on
implementing some basic linear system solvers, like gradient or steepest
descent, or even GMRES/MINRES. The amount of knowledge that I gained from
trying to implement these was remarkable.
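
For anyone curious, a minimal steepest-descent solver for Ax = b (with A
symmetric positive definite) fits in a few lines. This is just an
illustrative sketch, not the course's implementation:

    import numpy as np

    def steepest_descent(A, b, tol=1e-8, max_iter=10000):
        x = np.zeros_like(b)
        for _ in range(max_iter):
            r = b - A @ x  # residual doubles as the descent direction
            if np.linalg.norm(r) < tol:
                break
            alpha = (r @ r) / (r @ (A @ r))  # exact line-search step
            x = x + alpha * r
        return x

    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    print(steepest_descent(A, b))  # ~[0.0909, 0.6364]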

~~~
jph00
Gradient descent is covered. We even show a pytorch implementation :)

~~~
rabreu08
Great! It's 2 AM here and I totally missed it.

I will definitely give this course a try and spread the word. Thanks

------
fdej
> Locality: traditional runtime computations focus on Big O, the number of
> operations computed. However, for modern computing, moving data around in
> memory can be very time-consuming

I need to nitpick here... Big O notation is a way to describe growth rates of
functions. You can count data movements (or anything else) with Big O.

~~~
vog
Moreover, "moving data around in memory can be very time-consuming" means that
in the end, it is still about time, not about memory.

So the correct way would be to translate memory access into time, but that
means modelling the memory hierarchy, modelling caches in general, and
finally perhaps modelling the specific caching strategies in use.
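
As a quick illustration of why this matters (my own toy benchmark, not from
the article): traversing a row-major NumPy array against its memory layout
does the same Big-O amount of work but runs measurably slower, purely
because of the cache hierarchy.

    import numpy as np
    import time

    a = np.random.rand(4000, 4000)  # C order: each row is contiguous

    start = time.perf_counter()
    s1 = sum(a[i, :].sum() for i in range(a.shape[0]))  # walks memory in order
    row_time = time.perf_counter() - start

    start = time.perf_counter()
    s2 = sum(a[:, j].sum() for j in range(a.shape[1]))  # strided access
    col_time = time.perf_counter() - start

    print(row_time, col_time)  # same O(n^2) work; columns are slower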

~~~
snarfy
The limits on memory access are physical, as illustrated by grass hopper's
famous video about nanoseconds and the speed of light.

Should computer algorithms always assume they need to model caches, since
caches are never going away? When determining computational complexity,
every operation and memory access is treated as taking the same unit of
time, but real memory doesn't and never will behave that way.

~~~
SAI_Peregrinus
Grace Hopper, not grass hopper. The latter is an insect, the former was an
awesome computer engineer who invented the compiler.

~~~
zzazzdsa
Then again, both have something to do with bugs on some level...

------
GregBuchholz
Anyone have a comparison to the "Coding the Matrix" book and lectures?

[http://codingthematrix.com/](http://codingthematrix.com/)

[https://cs.brown.edu/video/channels/coding-matrix-fall-2014/](https://cs.brown.edu/video/channels/coding-matrix-fall-2014/)

------
lemming
For someone with an ancient undergrad math background and only "interested
observer" level of machine learning knowledge, would it be better to do this
course before tackling the deep learning one?

~~~
wodenokoto
No, start with fast.ai's deep learning course. They take a top-down
approach, so you start by connecting layers to solve deep learning problems
and then dig down into the theory behind it all.

If you want a bottom-up approach, go look at Ng's Coursera course.

[http://course.fast.ai](http://course.fast.ai)

~~~
jph00
Absolutely right. There's very little linear algebra required in deep
learning. This Computational Linear Algebra course mainly covers
decompositions, which aren't used much in deep learning.

------
flor1s
Thanks for sharing this, it seems like a lot of interesting material is being
discussed. The audience seems to be more like the hacker news visitor than the
average student though, as it feels like little hand holding is provided.

I've just started lecture 1 but I already felt some minor frustrations:

- One of the links in the first lecture is to a notebook about an intro to
convolutions, but that notebook is just a big code dump.

- After executing the exercises, you lose the expected answer. It might be
better if the answers were included as a comment in the code fragment.

- Sometimes the given answers are not actually the answer but just a
computation performed on the way to the answer. E.g., for the matrix-matrix
products section in lecture 1 the suggested answer is just the resulting
matrix from doing the matrix product, but according to the question in the
text the answer should be the actual cheapest shop.

- Is this a USF course or a fast.ai course?

I don't know if the author is planning on improving the material, because
right now it feels a bit like a beta version.

------
ceyhunkazel
A top-down approach is the best way to teach and learn. Well done!

------
will_pseudonym
I'm hooked by the title and excited about teaching people the linear algebra
gospel. Powering search engines forever.

------
unityByFreedom
I'm excited to do this after I get through as much of the DL course as I can.
Maybe that's a bit backwards but whatever.

Thanks for your hard work, Rachel! Really curious what you two will get up to
next.

------
MarkMMullin
Nice - I was taken by the "acceptable accuracy" comment, much less
stomach-acid-inducing than "numerical stability" :-)

------
taw55
Would one get much out of this course with only minimal ML background?

~~~
jph00
Yes, it doesn't directly cover much ML. It does assume some basic linear
algebra, but provides links to places where you can learn that quickly.

------
kyrre
Why 'computational' and not 'numerical'?

~~~
dom0
Probably to emphasize that it is a practical course. Numerical courses, at
least those I've seen or completed, tend to focus on algorithms and their
properties, not implementation.

> Jeremy and I developed this material for a numerical linear algebra course
> we taught in the University of San Francisco’s Masters of Analytics program,
> and it is the first ever numerical linear algebra course, to our knowledge,
> to be completely centered around practical applications and to use cutting
> edge algorithms and tools,

------
stuartaxelowen
... What part of linear algebra isn't computational?

~~~
geokon
Like... most of it? You can't do linear algebra "safely" without doing error
analysis. Lots of decompositions and operations are very useful for proofs
and for finding bounds, but can't be used directly to compute values. It's
why a good fraction of what people do with linear algebra is numerically
garbage.
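
A classic small example of this (my sketch, nothing from the course):
solving least squares via the normal equations is perfectly valid algebra,
but it squares the condition number, while a backward-stable solver stays
accurate.

    import numpy as np

    t = np.linspace(0, 1, 50)
    A = np.vander(t, 12)   # ill-conditioned design matrix
    x_true = np.ones(12)
    b = A @ x_true

    # Textbook route: solve A^T A x = A^T b. Fine on paper, but it
    # squares the condition number of A.
    x_normal = np.linalg.solve(A.T @ A, A.T @ b)

    # Backward-stable route: SVD-based least squares.
    x_svd, *_ = np.linalg.lstsq(A, b, rcond=None)

    print(np.linalg.norm(x_normal - x_true))  # typically large
    print(np.linalg.norm(x_svd - x_true))     # small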

------
ianai
I've been wanting to do a math refresher in linear or modern algebra for a
while... Tempting.

