A Geometric Review of Linear Algebra [pdf] (nyu.edu)
100 points by ivan_ah on Feb 22, 2014 | 27 comments

The A in the title is for AWESOME.

I have been writing about linear algebra almost non-stop for the past two months, trying to organize the best possible "playlist" of topics to cover a linear algebra course. It's a hard problem, this linearization of a graph of dependencies into a stream of .tex.

Prof. Simoncelli has a very good geometrical take on the problem. He manages to cover all the essential topics in 10 pages. Geometrically speaking, vectors = arrows and matrices = transformations on arrows.

All the essential topics? I find eigenvalues missing (and probably determinants as a stepping stone to them). Dunno if the most pedagogical approach would be to put them before or after the SVD, though.

At least outside a software-engineering context, you're usually taught eigenvalues first, as they are quite useful for a lot of different problems... basically all linear systems.

At least the way our curriculum was arranged, in first year we had a 'simple' linear algebra course that consisted of solving systems of linear equations (like Gaussian elimination) and 'simple' things with matrices and vectors, and then ended in a rush to teach eigendecomposition, which we pretty much just learned by rote. I don't believe anyone really had a feel for what we were actually doing.

But the flip side was that we learned eigendecomposition in time for all the 'fun' applications of it, and by the time our far more rigorous linear algebra course occurred in 3rd year (which ended with the SVD), when we went over eigendecomposition again, it was like a brilliant 'OH, I GET IT' moment for everyone in the class.

> I miss eigenvalues

They're not missing, just judiciously omitted. Actually, seeing the SVD presented in an introduction is what inspired me to post this.

The SVD approach works for any matrix A in R^{m x n}:

   A = UΣV^T
But if A is symmetric and you allow the singular values to be +/-, then you can find a single matrix U that serves as both the left basis and the right basis for A. Hence we obtain the eigendecomposition:

   A = UΛU^T
So SVD is the mother ship, and eigenstuff is a special case.
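A minimal numpy sketch of that special case (the matrix here is just an illustration, not from the notes): for a symmetric matrix, the SVD and the eigendecomposition agree up to the ordering and signs of the singular values.

```python
import numpy as np

# An illustrative symmetric matrix (any symmetric A would do)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigendecomposition: A = U Lambda U^T
eigvals, U = np.linalg.eigh(A)

# SVD: A = U Sigma V^T
U_svd, sigma, Vt = np.linalg.svd(A)

# The singular values are the absolute values of the eigenvalues,
# and both factorizations reconstruct A exactly.
assert np.allclose(np.sort(sigma), np.sort(np.abs(eigvals)))
assert np.allclose(U @ np.diag(eigvals) @ U.T, A)
assert np.allclose(U_svd @ np.diag(sigma) @ Vt, A)
```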

He touches on it at the end with the singular value decomposition, but doesn't mention that you can use the SVD to get eigenvectors/eigenvalues. He also doesn't cover how to compute the SVD. This could definitely be covered in more depth, but maybe not in a single-semester undergraduate course. I think a follow-on set of notes that starts with eigenvectors/eigenvalues and continues into the more fun parts of linear algebra would be very useful.

You definitely don't need determinants to learn everything you need to know about eigenvalues.

I would love to see your take on this dependency graph, no matter how subjective and incomplete it turns out to be.

    {vecops} = {Vector operations}

    {RREF}   = {Reduced row echelon form}

    |  {patience}
    |  |  {dotproduct}
    |  |  |
    {Mops}   = {Matrix operations}
    {Meqns}  = {Matrix equations}
    {Mmult}  = {Matrix multiplication}
    {Mdet}   = {Determinants}
    {Minv}   = {Matrix inverse}

    {linear} = {Linearity}
    {plane}  = {Lines and planes}

    {proj}   = {Projections}
    {basis}  = {Coordinate projections}

    {VS1}    = {Vector spaces}
    {VS2}    = {Abstract vector spaces}           // knowledge buzz moment
    |  {dotproduct}
    |  |
    {IPS}    = {Abstract inner product spaces}

    | {Mmult}
    | |  {function  f:R->R}
    | |  |
    {linT}   = {Linear transformations}
    {VSRREF} = {Vector space techniques for matrices}
    {TasM}   = {Finding matrix representations}
    {basis2} = {Change of basis for matrices}

    |  {linT}
    |  |
    {eigV}   = {Eigenvalues and eigenvectors}
       {Mdecomp}= {Matrix decompositions}
    {specM}  = {Special types of matrices}

    {invMthm} = {Invertible matrix theorem}          

    {GSorth} = {Gram--Schmidt orthogonalization}

    {LP}     = {Linear programming}

    {solve}  = {Solving systems of equations}
    {LSaprx} = {Least squares approximate solution}
    {ML}     = {Machine learning}

    {LAoverZ_q}= {Linear algebra over a finite field}
    ||{ECCs}   = {Error correcting codes}
    |{crypto} = {Cryptography}
    {netcode}= {Network coding}

    {fourier}= {Fourier series}

    | {prob}   = {Probability density}
    | |  {LAoverC}= {Linear algebra with complex numbers}
    | |  |
    {quantum}= {Quantum mechanics}

UPDATE: Okay so if anyone is interested, I just put up a pre-release of the book https://gum.co/noBSLA

euuuhhhh, are people really not being taught this? Eigenvalues/eigenvectors, space rotations, n-dimensional dot products with missing dimensions, LU/RU systems...

How can people do computer science without this knowledge? It is needed for:

1) solving equations;

2) transforming 3D vectors and then projecting them onto a 2D space (a screen);

3) text indexation;

4) resolving graph problems (like drawing a CPU based on declared connections between "wires");

5) and with dual spaces you access even more firepower (Fourier, Laplace...);

6) clustering a db is homeomorphic to a balanced n-partition of a graph.

Geometry is one of the most powerful tools in the computer industry, prior to knowledge of any programming language.
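Point 2 in the list above can be sketched with a tiny, purely illustrative pinhole-style projection (the matrix `P`, the focal length, and the `project` helper are all made up for the example):

```python
import numpy as np

# Pinhole-style projection: divide by depth z, then drop z with a
# 2x3 matrix. Focal length f = 1 is an arbitrary choice.
f = 1.0
P = np.array([[f, 0.0, 0.0],
              [0.0, f, 0.0]])

def project(point3d):
    x, y, z = point3d
    return P @ np.array([x / z, y / z, 1.0])

# A point twice as far away lands twice as close to the center:
assert np.allclose(project((1.0, 1.0, 1.0)), [1.0, 1.0])
assert np.allclose(project((1.0, 1.0, 2.0)), [0.5, 0.5])
```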

You seem to be suffering from a variant of the Typical Mind Fallacy [1]; let's call it... the Typical Programmer Fallacy. Most programmers never need to do the things you mentioned, or anything else that requires linear algebra.

[1] http://lesswrong.com/lw/dr/generalizing_from_one_example/

She's talking about people doing computer science and not about what most programmers do.

This link IS people being taught this. Don't be so dismissive of people whose lives are slightly different from yours.

As the ultimate dunce of my former universities, and having been scorned in computer-related jobs for dismissing technologies I did not fully grasp, I feel total empathy with people who would be hurt by my own words.

However, I do feel it is necessary: even a limited person can improve by squeezing all of the juice out of the most simple, basic maths. I cheat: I use simple real-world solutions from outside computer science in my code.

I noticed the "good students/developers" who were talented tend to solve problems the complicated way, and they developed both an agility and a comfort zone out of it. So my way of helping them is saying: «hey man, programming is not about writing code, it is about solving problems with well-known, basic understanding.»

You could call it «revenge». I call it irony. And if my words hurt someone, then since I am way more moderate than what I had to cope with, I consider I am being nice.

I won't apologize.

PS I AM surprised this knowledge is not a requirement for CS.

Like knowing that a non-linear problem (like estimating the cost of resources "per user" on the cloud, with all their "discrete" rough-edged pricing plans) cannot be solved with linear algebra.

The cloud and the cost of IP transit (95th percentile) defeat, by their nature, any possible estimation. And still I see PhDs and geniuses trying to code the pricing as linear equations in the evolution of users. This is mathematically impossible.

So am I a genius, a dunce, or what? I would be glad to know I am wrong; I would stop wondering.

Of course non-linear problems can be solved with linear algebra. This is the basis of the Gauss–Newton algorithm, which reduces a non-linear problem to the repeated solving of a linear system.
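A minimal sketch of that reduction, assuming a toy one-parameter model y = exp(a·x) (the data, names, and starting point are all made up for illustration): each Gauss–Newton step solves a *linear* least-squares problem for the parameter update.

```python
import numpy as np

# Synthetic data from a non-linear model y = exp(a * x)
x = np.linspace(0.0, 1.0, 20)
a_true = 1.5
y = np.exp(a_true * x)

a = 1.0  # initial guess
for _ in range(20):
    r = y - np.exp(a * x)              # residuals of the current fit
    J = (x * np.exp(a * x))[:, None]   # Jacobian of the model w.r.t. a
    # The linear part: solve J @ delta ~= r in the least-squares sense
    delta, *_ = np.linalg.lstsq(J, r, rcond=None)
    a += delta[0]

# The iteration recovers the true parameter
assert abs(a - a_true) < 1e-8
```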

Estimation through linear techniques doesn't work on non-linear problems. Otherwise, if I have X users, tell me the formula to guess my IP transit bill when the number of users doubles?

If you have it, you are a genius.

So am I a genius, a dunce or what?

A 14 yr old, it sounds like.

Don't let this irritate you. Arrogant, douchy HN comments are not worth stressing over.

If you study computer science, you will need to learn this. However, I would guess that most, or at least a lot of, people here have never studied computer science at the university level.

> are people really not being taught this?

You make it sound like these are obvious things, but most people have trouble even with basic math, let alone linear algebra.

I wrote a short 4-page tutorial on linear algebra myself and it covers quite a few things. Would love to hear what you think about it: http://minireference.com/blog/linear-algebra-tutorial/

The problem is not the content of any text. Btw, yours is good, but actually not for me: not because it is bad, but because my brain is this way.

Knowledge doesn't magically jump into people's brains.

The most important part about knowledge is not its raw, densely packed formalism but the teaching, or leaving people enough leisure time to study by themselves. It is the art of letting people build their own intuitive understanding. No computer, no book, no thing can replace a good teacher, or a good book that suits you.

That is why we need to disagree ;) then dialogue, and then agree again. Because maybe dialogue is a way of learning.

Maybe this book will also be of interest: http://www.amazon.com/Practical-Linear-Algebra-Geometry-Tool...

I'm using this book as a companion to this edX course:


I used this book to pass my course in linear algebra at university, and it also helped me out tremendously in my computer graphics course.

Speaking of books, my No bullshit guide to linear algebra is almost ready. Anyone taking a linear algebra class with $15 in their pocket should sign up for the prerelease: http://bit.ly/18xPbs9 It starts from the basics and gets to quantum mechanics by the end ;) All in just 250pp on 5½in x 8½in paper.

You can check out a 4 page tutorial of material extracted from the book: http://minireference.com/blog/linear-algebra-tutorial/

I personally don't like how he uses the term "system" in place of "mapping" or "transformation," but these are some good notes.

1. The section on matrices should start (not end) with the orthogonal, length-preserving coordinate transformation. It's the easiest geometry/trig connection, the elements can be worked out by hand, it demonstrates the unitary property, and it makes a nice segue into the inverse.

2. The section on linear system response could mention Fourier series (even the Fourier integral).

3. It's still a mystery to me why linear algebra is not taught using Dirac notation.
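That first point can be illustrated in a few lines of numpy (the angle and vector below are arbitrary choices): a rotation is an orthogonal, length-preserving transformation whose inverse is just its transpose.

```python
import numpy as np

theta = 0.3  # arbitrary rotation angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([3.0, 4.0])

# Length-preserving: ||R v|| == ||v||
assert np.isclose(np.linalg.norm(R @ v), np.linalg.norm(v))

# Unitary/orthogonal property: R^T R = I, so the inverse is the transpose
assert np.allclose(R.T @ R, np.eye(2))
assert np.allclose(np.linalg.inv(R), R.T)
```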

