

What are eigen values? - apr
http://www.physlink.com/Education/askExperts/ae520.cfm

======
ntownsend
That was an excellent explanation of the intuition and motivation behind
eigenvectors.

Eigenvectors are also used extensively in pattern recognition. Notably, in
facial recognition: <http://en.wikipedia.org/wiki/Eigenface>

~~~
fburnaby
And it's used there for the same reason stated in the article.

------
baddox
After reading the article, I have but one question: What are eigen values?

A quick lookup on Wikipedia reveals that they are used in linear algebra,
for example in matrix transformations. I think this article failed by trying
to relate eigenvalues to everyday scenarios, much as beginning calculus books
give the total pressure on an underwater dam door, or the mass of a rod, as
examples of the usefulness of definite integrals. In those cases you end up
calculating the area under a curve, but the curve happens to be the graph of
a constant function or a first-order polynomial, so the area can just as
easily be found with the elementary formula for the area of a trapezoid.

~~~
antiform
In linear algebra, for a given linear transformation M (typically represented
as a matrix standing for some operation), an eigenvalue of M is a scalar c
such that for some non-zero vector v (called an eigenvector), Mv = cv, where
the multiplication on the left is matrix multiplication and the multiplication
on the right is multiplication of v by the scalar c.

The definition is significant because it says that for a certain vector v, the
transformation M, no matter how complicated it may be, just scales v by a
factor of c. This is useful, for instance, if you want to choose an axis along
which to evaluate the range of the transformation: by choosing an eigenvector,
you are choosing a "simple" or "natural" perspective from which to evaluate
it.
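
Concretely, here's a small numpy sketch (the matrix M below is an arbitrary
example I made up, not anything from the article):

    import numpy as np

    # An arbitrary 2x2 linear transformation.
    M = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    # numpy returns the eigenvalues and, as the columns of the
    # second result, the corresponding eigenvectors.
    eigenvalues, eigenvectors = np.linalg.eig(M)

    for c, v in zip(eigenvalues, eigenvectors.T):
        # The defining property: Mv = cv.
        assert np.allclose(M @ v, c * v)
        print("eigenvalue", c, "eigenvector", v)

For this M the eigenvalues come out as 3 and 1, with eigenvectors along
(1, 1) and (1, -1): the directions the transformation merely stretches.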

~~~
Novash
Look, I am good at Math. I even love Number Theory. But what you wrote scares
me. Can I run away now?

(I hope to one day be able to look at it and say 'my, that is so simple...'
like I do with high school math)

~~~
Oompa
This is the problem I had with Linear Algebra. The first half of the class
felt like I was just being drilled on definitions. But once all the
definitions click, it's rather simple.

~~~
maggie
Honestly, linear algebra is the next time you have to make a 'jump' in the
regular sequence of math.

You hit algebra: new concepts, so you have to jump a little. Same with
calculus. Linear algebra is that next jump.

~~~
eru
And it gets easier. Linear Algebra is no more complicated than calculus ---
rather less so. But it tends to be more abstract.

------
chrischen
Ahh. The Internet: teaching me what school failed to teach.

------
pkulak
It depresses me a bit that in college I knew exactly what all this was... for
a month... until the day after I was tested on it.

~~~
jordyhoyt
The thing is, at least for me, we learned these concepts in college, but
only in terms of getting the right answer on the test. They gave us the
abstraction, not its basis in reality.

It would have been very helpful to have this kind of grasp during my research
in university, where I had to explain eigenvalues and eigenvectors to fellow
computer science people. It probably also would have helped me understand
what the hell my own code did.

------
srn
Wow, that's the best explanation I've read. Very intuitive. Not to say I
didn't know the original definition at some point, but having such a concept
tied to a concrete example makes it much easier to remember.

It makes me wish that most mathematical concepts were explained with some kind
of real world analogy.

------
yan
This is ridiculous. Just last week I was thinking that even though I took
linear algebra and did fairly well, I never really grasped eigenvalues and
eigenvectors: they come up fairly frequently as far as math concepts go, and
I don't instantly have an intuitive grasp of them. Literally a week later, I
see this posted on HN.

------
diiq
I first encountered eigenmagic in machine learning --- we were interested in
the eigenvectors of the adjacency matrix of a graph. When the space that the
matrix lives in is so abstract, 'real world' examples don't make it any easier
to visualize what's happening.

If you really want to understand them intuitively, I recommend plotting a
few hundred matrices and their respective eigenvalues and eigenvectors (a
minimal starting point is sketched below). When you feel like you can predict
what the results will look like at each frequency, _then_ the stories about
rubber bands and shiny coins will make sense.

Or maybe you're faster, and the stories helped you make the leap --- in which
case, ignore me!
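
If you want to try that, here's one possible starting point with numpy and
matplotlib (the random matrix and the plotting choices are just my own
invention):

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    A = rng.normal(size=(2, 2))  # rerun with new matrices to build intuition

    # Points on the unit circle, and their images under A.
    t = np.linspace(0, 2 * np.pi, 200)
    circle = np.vstack([np.cos(t), np.sin(t)])
    image = A @ circle

    plt.plot(circle[0], circle[1], label="unit circle")
    plt.plot(image[0], image[1], label="image under A")

    # Overlay any real eigenvectors, scaled by their eigenvalues.
    w, v = np.linalg.eig(A)
    for c, vec in zip(w, v.T):
        if np.isreal(c):
            tip = np.real(vec) * np.real(c)
            plt.arrow(0, 0, tip[0], tip[1], head_width=0.05)

    plt.axis("equal")
    plt.legend()
    plt.show()

The circle gets smeared into an ellipse, and the arrows mark the directions
that land on their own line, only stretched or flipped (which, for a
non-symmetric matrix, need not be the ellipse's axes).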

------
fburnaby
That was a very nice explanation. I love the rubber band analogy. But one
minor qualm: a transformation typically has several eigenvalues and
eigenvectors associated with it. What's really being discussed is the "first"
eigenvector (usually the one with the largest eigenvalue, the "principal
component") and its associated eigenvalue. There are (or at least can be) more
of them! We could think about stretching the elastic along its width as well,
as in the sketch below.
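
For instance, a tiny numpy example (the stretch factors here are made up):
stretching 2x along the band's length and squashing to half along its width
gives two eigenpairs, not one:

    import numpy as np

    # 2x along x (length), 0.5x along y (width).
    T = np.diag([2.0, 0.5])

    w, v = np.linalg.eig(T)
    print(w)  # [2.  0.5]  -- two eigenvalues
    print(v)  # columns (1, 0) and (0, 1) -- the two eigenvectors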

------
proee
I still remember the following question on my linear algebra test in college:

"Prove that A and A-Inverse have the same Eigen Values and corresponding Eigen
Vectors."

The solution to this made no sense, but I managed to memorize it so that I
could pass the test.

Forest for the trees?

~~~
jibiki
It's not quite true, is it...?

    Av = cv          (v is an eigenvector, c the corresponding eigenvalue)
    (A^-1)Av = (A^-1)cv
    v = c(A^-1)v
    (1/c)v = (A^-1)v

So if c is an eigenvalue of A, then 1/c is an eigenvalue of (A^-1). (c can't
be 0 because A is invertible, I think.)
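
A quick numeric check with numpy (the invertible matrix A below is made up):

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])  # eigenvalues 5 and 2
    A_inv = np.linalg.inv(A)

    w, v = np.linalg.eig(A)
    w_inv, _ = np.linalg.eig(A_inv)

    # Same eigenvectors, reciprocal eigenvalues: (A^-1)v = (1/c)v.
    assert np.allclose(sorted(w_inv), sorted(1.0 / w))
    for c, vec in zip(w, v.T):
        assert np.allclose(A_inv @ vec, (1.0 / c) * vec)

So the eigenvectors really do carry over; it's only the eigenvalues that get
inverted.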

------
bonsaitree
If ever there was a time when I wish every browser had the MathML plug-in...
_sigh_.

------
swombat
This is a pretty good explanation for an essentially mathematical/abstract
concept.

Another name for eigenvectors is "identity" vectors, iirc (it's been a few
years...).

~~~
cracki
"identity vectors"... do you perhaps mean "unit vectors", of length 1?

~~~
bonsaitree
Yes.

For a linear system (of any kind of equivalence) within an N-dimensional
vector space, the Eigenvalues represent the scaling factors across those
dimensions when the system's state is represented by an NxN sparse-
diagonalized matrix (i.e. all values are 0 except for the main diagonal).

Those non-zero values along the main diagonal are its Eigenvalues and its rows
are Eigenvectors.

For the common 3D isometric (e.g. xyz) coordinate system, the Eigenvalues can
be thought of as a kind of multiplier across the unit vectors (Eigenvectors)
[[1,0,0][0,1,0][0,0,1]]. This is the "stretching" analog mentioned in the
article.

FWIW, Eigenvalues are not just a salient property of linear systems (i.e.
matrices), but also of higher-order tensors.

Finding (or, more often, approximating) these "characteristic scaling states"
is a critical step in numerical analysis in everything from quantum mechanics
to financial hedging strategies to consumer product marketing plans.

If you've ever represented a system as a series of Markov probability chains,
every row of the "convergent/dominant" (if any) state contains an Eigenvalue.

~~~
gjm11
I'm afraid there are several errors in that.

1. A linear mapping is not a "kind of equivalence" by any reasonable
definition. For instance, the function that maps every vector to 0 is a linear
mapping, and it has plenty of eigenvectors. (All with eigenvalue 0.)

2. The eigenvectors are not the rows of the diagonalized matrix. They are the
rows (or columns, depending on just how you define things) of the matrix that
does the coordinate transformation to diagonalize your matrix.

3. I think the paragraph beginning "For the common 3D isometric ..." is
rather confused. I certainly am when reading it. Perhaps the problem is in my
brain; what exactly do you mean? (Here's the nearest true thing I can think of
to what that paragraph says: For many, but not all, linear transformations
from a space to itself, there is an orthogonal coordinate system with respect
to which the transformation's matrix is diagonal; then the eigenvectors are
the axes of that coordinate system, and the eigenvalues are the amounts by
which vectors along those axes get stretched.)

4. Tensors are just as linear as matrices. (You can do nonlinear things with
a tensor, but then you can with a matrix too.)

5. The probabilities in a Markov chain's stationary state are not
eigenvalues. (Well, I'm sure they're eigenvalues of _something_; any set of
numbers can be made the eigenvalues of _something_; but they aren't, e.g.,
eigenvalues of the transition matrix.) What you may be thinking of is: a
stationary state of a Markov chain is an eigen _vector_ of the transition
matrix, with eigenvalue 1; all eigenvalues have absolute value at most 1; if
the Markov chain is ergodic (i.e., can get from any state to any other), then
there is exactly one stationary state, exactly one eigenvector of eigenvalue
1, and all the other eigenvalues are strictly smaller. This is enough to
guarantee convergence to the stationary state.

Also: If eigenvalues are as important in dealing with higher-order tensors as
they are for the second-order case (i.e., matrices) then that's news to me.
Tell me more?
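
Going back to the Markov point in 5, here's a minimal numpy sketch (the
column-stochastic transition matrix is invented for illustration):

    import numpy as np

    # Each column sums to 1: column j gives the transition
    # probabilities out of state j.
    P = np.array([[0.9, 0.2],
                  [0.1, 0.8]])

    w, v = np.linalg.eig(P)
    print(w)  # [1.  0.7] -- one eigenvalue is exactly 1

    # The eigenvector for eigenvalue 1, normalized to sum to 1,
    # is the stationary distribution.
    i = np.argmin(np.abs(w - 1.0))
    pi = np.real(v[:, i])
    pi = pi / pi.sum()
    print(pi)  # approximately [0.667 0.333]

    # Running the chain converges to the same distribution.
    state = np.array([1.0, 0.0])
    for _ in range(100):
        state = P @ state
    assert np.allclose(state, pi)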

~~~
eru
> 1. A linear mapping is not a "kind of equivalence" by any reasonable
> definition. For instance, the function that maps every vector to 0 is a
> linear mapping, and it has plenty of eigenvectors. (All with eigenvalue 0.)

Linear mappings are the homomorphisms between vector spaces. So they are a
"kind of equivalence".

See for example
[http://en.wikibooks.org/wiki/Linear_Algebra/Definition_of_Ho...](http://en.wikibooks.org/wiki/Linear_Algebra/Definition_of_Homomorphism)

~~~
gjm11
The zero mapping is a homomorphism. I would not call it a "kind of
equivalence"; would you?

~~~
eru
It's a special (and quite trivial) "kind of equivalence".

See <http://en.wikipedia.org/wiki/Equivalence_class>

~~~
gjm11
I know what an equivalence class is, thank you. How does that make the map
from (say) R^3 to itself that sends everything to 0 a "kind of equivalence"?

It gives rise to an equivalence relation on R^3 in a fairly natural way, as
indeed any function gives rise to an equivalence relation on its domain: x~y
iff f(x)=f(y). But that doesn't mean that it _is_ a "kind of equivalence".

Now, obviously, "kind of" is vague enough that saying "an endomorphism of a
vector space simply Is Not a 'kind of equivalence'" would be too strong. But I
would like to know what, exactly, you mean by calling something a "kind of
equivalence", because I'm unable to think of any meaning for that phrase that
(1) seems sensible to me and (2) implies that endomorphisms of vector spaces
are "kinds of equivalence".

(Looking back at what bonsaitree wrote, I see s/he said "of any kind of
equivalence", which I unconsciously typo-corrected to "or any kind of
equivalence". But perhaps I misunderstood and bonsaitree meant something else,
though I can't think what it might be. bt, if you're reading this: my
apologies if I misunderstood, and would you care to clarify if so?)

~~~
eru
Yes, any function gives rise to equivalence classes. And linear functions give
rise to equivalence classes that preserve structure in vector spaces.

(I guess our discourse has reached the end of its usefulness here.)

~~~
bonsaitree
Yes. Thanks for the correct re-wording as "or" and "equivalence class".

------
RK
I'll put my vote in for this being a poor explanation of what eigenvalues are.

------
elai
Linear algebra felt like the same thing stated in 5 different ways.

