
Eigenvectors and Eigenvalues (2015) - vimalvnair
http://setosa.io/ev/eigenvectors-and-eigenvalues/
======
steamer25
3Blue1Brown has a good series on YouTube for building intuition in linear
algebra:

[https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2x...](https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab)

In one of the last videos in the (relatively short) series, he discusses
eigen-*:

~'eigen-stuffs are straightforward but only make sense if you have a solid
visual understanding of the prerequisites (linear transformations,
determinants, linear systems of equations, change of basis, etc.). Confusion
about eigen-stuffs usually has more to do with a shaky foundation than the
eigen-things themselves'

[https://youtu.be/PFDu9oVAE-g](https://youtu.be/PFDu9oVAE-g)

All of the videos in the series, including this later one on eigen-things,
focus on animations to show what the number crunching is doing to the
coordinate system.

~~~
AceJohnny2
3Blue1Brown (Grant Sanderson) is really, _really_ good. I follow a number of
education channels on YouTube, and Grant blows them all out of the water for
the kind of insights, new perspectives, and inspiration he provides. His
animations are fantastically put together to clearly and unobtrusively
illustrate the point he's making. I also really like his voice, soothing,
clear and with enough intonation to avoid boredom, and perfect pace. I wish I
had his linear algebra series back in college, I suspect I would have done
much better.

He's the creator I support the most on Patreon:
[https://www.patreon.com/3blue1brown](https://www.patreon.com/3blue1brown)

~~~
picklesman
I’ve taken three college-level courses on linear algebra and only really got
an intuition for eigenvectors/eigenvalues when watching his video series. Super
great!

~~~
emmelaich
I had problems with Principal Component Analysis; nice to see he's got one on
that as well!

[http://setosa.io/ev/principal-component-analysis/](http://setosa.io/ev/principal-component-analysis/)

(edit: ugh, contains an ad for MacKeeper)

~~~
jjoonathan
What, you don't want to remove your mac from trash?

Incidentally, that particular grammatical quirk is one I've seen a number of
times. Is it a thing for people coming from Russian? Or just general
difficulty with remembering the order of the two subjects in two-subject
phrases?

------
xevb3k
Whenever this kind of stuff comes up I feel like a bit of a fraud...

I’ve written a bunch of scientific data analysis code. I have a science PhD.
Written large image analysis pipelines that worked as well as the state of the
art... been published etc.

For the most part I’ve found basic math and heuristics to be good enough.
Every so often I go relearn calculus. But honestly, none of this stuff ever
seems to come in handy. Maybe it’s because most of what I encounter is novel
datasets where there’s no established method?

I reasonably regularly pick up new discrete methods, but the numerical stuff
never seems super useful...

I don’t know, just a confession I guess... it never comes up on interviews
either for what it’s worth.

~~~
soVeryTired
For a large fraction of probability theory, you only need two main facts from
linear algebra.

First, linear transforms map spheres to ellipsoids. The axes of the ellipsoid
are the eigenvectors.

Second, linear transforms map (hyper)cubes to parallelepipeds. If you start
with a unit cube, the volume of the parallelepiped is the determinant of the
transform.

That more or less covers covariances, PCA, and change of variables. Whenever I
try to understand or re-derive a fact in probability, I almost always end up
back at one or the other fact.

They're also useful in multivariate calculus, which is really just stitched-
together linear algebra.
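Both facts are easy to check numerically. A minimal NumPy sketch (the matrix values are illustrative, not from the thread; the eigenvector-axes claim is stated here for the symmetric case, as discussed below):

```python
import numpy as np

# An illustrative symmetric (covariance-like) matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Fact 1 (symmetric case): the unit circle maps to an ellipse whose axes
# lie along the eigenvectors, stretched by the eigenvalues.
eigvals, eigvecs = np.linalg.eigh(A)
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)  # each axis direction is preserved

# Fact 2: the unit square maps to a parallelogram with area |det A|.
area = abs(np.linalg.det(A))
print(eigvals, area)  # eigenvalues [1. 3.], area 3.0
```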

~~~
qmalzp
I think the first point is only true for symmetric matrices (which includes
those that show up in multivariable calc). In general, the eigenvectors need
not be orthogonal.
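A quick NumPy check of the non-symmetric case (illustrative matrix, not from the article):

```python
import numpy as np

# A non-symmetric matrix: its eigenvectors need not be orthogonal.
M = np.array([[2.0, 1.0],
              [0.0, 1.0]])
eigvals, eigvecs = np.linalg.eig(M)
v1, v2 = eigvecs[:, 0], eigvecs[:, 1]
dot = abs(np.dot(v1, v2))
print(eigvals, dot)  # eigenvalues 2 and 1; |v1 . v2| is far from 0
```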

~~~
soVeryTired
Yep, you could well be right. The image of an ellipse under a linear transform
is definitely an ellipse, but I'm not sure about the eigenvectors in the
general case.

The symmetric case is by far the most relevant for probability theory though.

~~~
tprice7
In general it's the eigenvectors of the positive-semidefinite (hence
symmetric) part of the left polar decomposition.
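A sketch of that decomposition via the SVD (the matrix here is an arbitrary example, not from the thread):

```python
import numpy as np

# Left polar decomposition A = P @ U computed from the SVD A = W S V^T:
# P = W diag(s) W^T is positive-semidefinite (hence symmetric), U = W V^T
# is orthogonal. The eigenvectors of P are the columns of W.
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])
W, s, Vt = np.linalg.svd(A)
P = W @ np.diag(s) @ W.T
U = W @ Vt
assert np.allclose(P @ U, A)            # decomposition reconstructs A
assert np.allclose(U @ U.T, np.eye(2))  # U is orthogonal
```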

------
lewis500
Interesting to see this back on the front page after three years. Still
remember us sitting in our living room drawing this on paper and arguing about
the right approaches.

Maybe one day vicapow and I will make a triumphant return to the explorables
space, but life has a way of getting in the way as you get older.

~~~
dang
That was
[https://news.ycombinator.com/item?id=8918259](https://news.ycombinator.com/item?id=8918259).

------
sannee
Eigen{vectors,values} seemed like this totally arbitrary concept when I first
learned about them. Later it turned out that they are actually really awesome
and pop up all the time.

Multivariable function extrema? Just look at the eigenvalues of the hessian.
Jacobi method convergence? Eigenvalues of the update matrix. RNN gradient
explosion? Of course, eigenvalues.
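The Hessian test is a one-liner in NumPy. A toy example (the function and its Hessian are made up for illustration):

```python
import numpy as np

# Classify the critical point of f(x, y) = x^2 + 3y^2 at the origin
# by the eigenvalues of its Hessian (which is constant for this f).
H = np.array([[2.0, 0.0],
              [0.0, 6.0]])

eigvals = np.linalg.eigvalsh(H)
if np.all(eigvals > 0):
    kind = "local minimum"
elif np.all(eigvals < 0):
    kind = "local maximum"
else:
    kind = "saddle point"
print(kind)  # local minimum
```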

------
danlugo92
I highly recommend 3Blue1Brown's Essence of Linear Algebra series[0] for
building a real grasp of linear algebra.

[0]
[https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2x...](https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab)

~~~
olskool
This is truly a favorite of mine.

------
toppy
Classic paper on Google's PageRank: "The $25,000,000,000 eigenvector"

[https://www.rose-hulman.edu/~bryan/googleFinalVersionFixed.pdf](https://www.rose-hulman.edu/~bryan/googleFinalVersionFixed.pdf)

------
WhompingWindows
Am I the only one with [Math Processing Error] all over this source? Ctrl+F
gives me 57 instances of that string.

~~~
akvadrako
I see that on some pages. Try reloading.

------
asafira
Quantum mechanics should be listed as another reason to learn about
eigenvectors and eigenvalues! =)

------
deepdiving12
Beautiful demos/explanations. Would've been really handy back in school.

------
lr4444lr
Great interactive demos.

------
KasianFranks
There's nothing to see here.

~~~
abiox
what do you mean? this seems like a strange statement, given the context.

------
haskellandchill
The visual explanation movement falls flat for me. It's like trying to
understand Monads through blog posts. It's great if you already understand the
concept to develop your intuition, or if you've never heard of the concept to
pique your interest, but it won't help in the intermediate area where you know
what you want to know but don't understand it fully. I need to build proofs
through incremental exercises to grasp these concepts.

~~~
whatshisface
As someone who understands eigenfunctions already, I don't understand the
pictures either. Here is the best way to think about it: a matrix is a
transformation, a composition of rotation, scaling, etc. Eigensets are lines
going through the origin that the matrix moves points along. So a rotation
would have no eigenvectors because _none_ of the points stay on the line they
started on, while a scaling along the x axis would have an eigenset that was
also along the x axis, consisting of the points that get pushed outward or
pulled inward along that line.

To imagine finding the eigenset, just ask, could I draw a line through 0,0
such that any point I put on it would stay on it after the matrix acted?
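That "does the line stay on itself?" question can be sketched numerically (a toy check for real eigen-lines, not from the thread):

```python
import numpy as np

# A line through the origin with direction v is invariant under A exactly
# when A @ v is parallel to v (2D parallel test: zero cross term).
def is_invariant_line(A, v):
    w = A @ v
    return bool(np.isclose(v[0] * w[1] - v[1] * w[0], 0.0))

scale_x = np.array([[2.0, 0.0], [0.0, 1.0]])     # stretch along x
rotate_90 = np.array([[0.0, -1.0], [1.0, 0.0]])  # 90-degree rotation

print(is_invariant_line(scale_x, np.array([1.0, 0.0])))   # True: x axis
print(is_invariant_line(rotate_90, np.array([1.0, 0.0]))) # False
```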

~~~
electricslpnsld
> So a rotation would have no eigenvectors

Rotations have eigenvectors: a 2D rotation has two complex eigenvectors, a 3D
rotation has one real and two complex eigenvectors, ...
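NumPy happily returns those complex eigenvalues (illustrative snippet, not from the article):

```python
import numpy as np

# A 2D rotation by theta has complex eigenvalues e^{+i theta}, e^{-i theta};
# no real eigenvector exists, but the complex ones are still there.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
eigvals, eigvecs = np.linalg.eig(R)
print(eigvals)  # approximately +1j and -1j for a 90-degree rotation
```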

~~~
whatshisface
That's a fair and true catch, but I can cover myself by pointing out that the
article was only talking about real eigenvectors. ;)

