
Don’t know how simple you want me to go, so apologies if this is pitched too basic.

You’ve got a matrix A, ie a 2D array of numbers.

You’ve got a vector v, ie a 1D array of numbers.

You can multiply A by v, written Av, to get a new vector.

Matrices can therefore be viewed as devices for taking a vector and giving a new one, ie as linear transformations. You can design a matrix to rotate vectors, stretch them etc.
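To make that concrete, here’s a tiny Python/numpy sketch (my own made-up example, not anything from the paper): a 2x2 matrix designed to rotate vectors.

    import numpy as np

    # A 2x2 matrix that rotates vectors 90 degrees counterclockwise.
    A = np.array([[0.0, -1.0],
                  [1.0,  0.0]])

    v = np.array([1.0, 0.0])   # a vector pointing along the x-axis

    print(A @ v)               # [0. 1.] -- the same vector, rotated onto the y-axis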

An eigenvector of a matrix is a vector that changes only by a constant factor when you apply (multiply by) the matrix. So Av = kv, where k is a number. For example, if k=2 then applying A just doubles every number in v. Eigenvectors are stretched or squashed by the matrix, no funny business like shearing or rotating.

An eigenvalue is the “k” mentioned above, ie there’s at least one eigenvector such that Av=kv. There could be multiple eigenvectors all with the same k, ie all changed in the same simple way by applying the matrix.
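Another toy numpy sketch (again my own example): a matrix that stretches the x-direction by 2 and the y-direction by 3, so [1, 0] is an eigenvector with eigenvalue 2.

    import numpy as np

    A = np.array([[2.0, 0.0],
                  [0.0, 3.0]])

    v = np.array([1.0, 0.0])     # an eigenvector of A
    print(A @ v)                 # [2. 0.] == 2 * v, so the eigenvalue k is 2

    # numpy can find all eigenvalues and eigenvectors at once:
    k, vecs = np.linalg.eig(A)
    print(k)                     # [2. 3.]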

Ok? Slight complication: matrix A may contain complex numbers, not just your garden-variety real numbers. But the above all still applies.

“Symmetric matrices” are matrices that have a symmetry across the top left to bottom right diagonal. So the number at A[i][j] is the same number as that at A[j][i].

“Hermitian” matrices are the equivalent of symmetric matrices when we’re dealing in complex numbers. Rather than A[i][j] being equal to A[j][i], each entry A[i][j] is the “complex conjugate” of A[j][i] (for real entries that’s the same thing).

(In case you’ve forgotten complex numbers: we extend the real numbers to include numbers of the form r + zi, where r is the “real part” (just a real number) and “zi” is a real number z multiplied by i, where i = sqrt(-1). Complex numbers were introduced to let us solve equations we couldn’t otherwise solve, but have turned out to be super useful everywhere. The “complex conjugate” of a complex number is obtained by flipping the sign of z, eg (2 + 3i) -> (2 - 3i).)
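If it helps, here’s a quick numpy check (made-up 2x2 example) that a matrix is Hermitian, ie equal to its own conjugate transpose:

    import numpy as np

    # The diagonal is real, and H[0][1] is the complex conjugate of H[1][0].
    H = np.array([[2.0,        1.0 + 3.0j],
                  [1.0 - 3.0j, 5.0       ]])

    print(np.allclose(H, H.conj().T))   # True -- H equals its conjugate transpose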

Anyway, so Hermitian matrices are “symmetric” in this way and turn out to be super useful in physics. They also have the property that the eigenvalues of a Hermitian matrix H are always real numbers, despite the fact that H may contain complex numbers.
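You can see that numerically with the same toy matrix: numpy’s eigvalsh routine (which assumes a Hermitian input) returns purely real eigenvalues.

    import numpy as np

    H = np.array([[2.0,        1.0 + 3.0j],
                  [1.0 - 3.0j, 5.0       ]])

    print(np.linalg.eigvalsh(H))   # [0. 7.] (up to rounding) -- real, despite the complex entries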

What the paper states is a relationship between the eigenvalues and eigenvectors of a Hermitian matrix H and the eigenvalues of a smaller submatrix H’, where H’ is just H with one row and one column removed, the row and column having the same index. This is neat, because it can be used to speed up the calculation of eigenvectors (a task as important in scientific computing as, say, array sorting is in CS) in some specific situations. Commenting further is beyond me.
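For what it’s worth, if the paper in question is the “eigenvectors from eigenvalues” one (Denton, Parke, Tao and Zhang), the identity it states is easy to check numerically. A rough sketch, using a random Hermitian matrix and arbitrarily chosen indices i and j:

    import numpy as np

    n, i, j = 4, 1, 2                       # size and indices picked arbitrarily

    # Build a random Hermitian H (illustrative only).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    H = (X + X.conj().T) / 2

    lam, V = np.linalg.eigh(H)              # eigenvalues (real) and eigenvectors of H
    M = np.delete(np.delete(H, j, axis=0), j, axis=1)   # H with row j and column j removed
    mu = np.linalg.eigvalsh(M)              # eigenvalues of the submatrix

    # The identity: |v_i[j]|^2 * prod over k != i of (lam_i - lam_k)
    #               == prod over k of (lam_i - mu_k)
    lhs = abs(V[j, i])**2 * np.prod([lam[i] - lam[k] for k in range(n) if k != i])
    rhs = np.prod(lam[i] - mu)
    print(np.isclose(lhs, rhs))             # True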




Thanks! It's been 15 years since I did any linear algebra or had any use for eigenvectors or matrices, and I could follow this. Much appreciated.


ELI(Bachelor of Comp Sci) went very well



