What do the eigenfunctions look like? I didn't know the circular disk was such an exotic problem for Laplace's equation.
edit: Ah, I misunderstood the problem. The eigenfunctions are exactly solved; the problem of sorting and ordering their eigenvalues is apparently not! From their page 4,
- "Although all the eigenvalues of the Dirichlet and Neumann Laplacians on the unit disk are explicitly known in terms of zeros of the Bessel functions or their derivatives, see §2 below, in each case the spectrum is given by a two-parametric family, and rearranging it into a single monotone sequence appears to be an unfeasible task."
The title words remind of an unrelated fact, Gershgorin Disks:
The eigenvalues of any N x N matrix, A,
are contained in the union of N discs in the complex plane.
The center of the i-th disc is the i-th diagonal element of A.
The radius of the i-th disc is the sum of the absolute values
of the off-diagonal elements in the i-th row.
It's rather remarkable, unexpected, and perhaps shocking, when you first hear it. There does not seem to be enough information to make it true. But it is.
Matrix theory by Franklin is a great, affordable book containing many interesting results such as this — can highly recommend for those interested in linear algebra.
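It's also a quick numerical sanity check, assuming numpy (the random test matrix and the tolerance are arbitrary):

```python
# Sketch: verify the Gershgorin disc theorem on a random matrix.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(6, 6)) + 5 * np.diag(rng.normal(size=6))

centers = np.diag(A)                                   # disc centers: diagonal entries
radii = np.sum(np.abs(A), axis=1) - np.abs(centers)    # row sums of off-diagonal magnitudes

for lam in np.linalg.eigvals(A):
    # every eigenvalue must lie in at least one disc
    assert np.any(np.abs(lam - centers) <= radii + 1e-12)
print("all eigenvalues inside the union of the Gershgorin discs")
```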
It's not that remarkable at all? The proof requires the definition and triangle inequality, that's all?
Given Ax=λx, take i for which |xᵢ| is largest. Look at the i'th equation: sum aᵢⱼxⱼ = λxᵢ, move the aᵢᵢxᵢ term to the rhs, take absolute values, divide by |xᵢ|, apply triangle inequality, and you have |aᵢᵢ - λ| ≤ sum |aᵢⱼ| over j≠i. So for every eigenvalue you can find such a disc.
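Spelled out as a display (choosing $i$ so that $|x_i| = \max_j |x_j| > 0$), the whole argument is:

$$ (\lambda - a_{ii})\,x_i = \sum_{j \ne i} a_{ij} x_j \;\Longrightarrow\; |\lambda - a_{ii}|\,|x_i| \le \sum_{j \ne i} |a_{ij}|\,|x_j| \le |x_i| \sum_{j \ne i} |a_{ij}| \;\Longrightarrow\; |\lambda - a_{ii}| \le \sum_{j \ne i} |a_{ij}|. $$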
Being easy to prove doesn't make it unremarkable. Lots of theorems, including this one, have straightforward proofs once you are given the exact formulation. The tricky part is coming up with the idea for the theorem itself. I remember being (mildly) shocked when I was taught this in undergrad, it just seemed too good to be true.
I think that's just how textbooks present it. A fact is stated but you're lacking intuition.
If you had played around a bit with Laplacian matrices, like tri-diagonal matrices with stencil [-1, 2, -1], and found that their eigenvalues lie within 2 ± 2, and if you also realized that A + τI has the same eigenvalues as A, shifted by τ, then it's a small step to consider that the magnitude of the off-diagonal may have something to do with the spread of the eigenvalues.
It's likely that Gerschgorin stumbled upon it like this.
Yeah, exactly. Your matrix shows up when you do a 3-point discretization of -u''(x) = λu(x) with u(0) = u(1) = 0, at 5 equidistant (interior) grid points. That's the discrete, 1D version of the problem they're looking at in the paper.
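Here's a minimal sketch of that setup, assuming numpy (N = 5 interior points as above): every Gershgorin disc is centered at 2 with radius at most 2, so the theorem puts the eigenvalues in [0, 4], and the actual eigenvalues 2 - 2cos(kπ/6) do land there.

```python
# Sketch: 3-point discretization of -u'' = lambda*u on (0,1), u(0)=u(1)=0,
# with 5 interior grid points, scaled so the stencil is [-1, 2, -1].
import numpy as np

N = 5
A = 2 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)

eigs = np.sort(np.linalg.eigvalsh(A))
exact = 2 - 2 * np.cos(np.arange(1, N + 1) * np.pi / (N + 1))

print(eigs)                      # all inside the Gershgorin interval [0, 4] = 2 +/- 2
print(np.allclose(eigs, exact))  # matches the known formula 2 - 2*cos(k*pi/(N+1))
```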
Just thinking about some intuition for the result of the theorem: if the off-diagonal elements are zero, then each diagonal element is an eigenvalue. By continuity of the determinant, if the off-diagonal elements are small then $\det(A - a_{ii} I)$ is still almost zero, i.e. there is an eigenvalue near $a_{ii}$. So it suggests that the off-diagonal elements measure how far $a_{ii}$ is from being an eigenvalue.
They are saying that the Gershgorin theorem the OP is talking about is simple to prove, not the Pólya conjecture from the article, which took 70 years.
A common noise cancellation technique is to throw away small eigenvalues, as in PCA. This result relates eigenvalues to the structure of the matrix, so might be helpful for reducing ev's without bothering with diagonalization?
[Edit] This would presumably involve just zeroing out the rows with small diagonal elements and small-ish off-diagonal norm... Center the eigenvalue estimate disk at zero, and then zero out the rest of the row to make the estimate exact.
(and after one more thought about it, one would zero out the row /and column/ to preserve the symmetry of the matrix, if applicable, and thus keep the eigenvalues real. and one would probably want to think a bit about whether this kind of deletion actually makes sense for the problem... doing real PCA isn't /that/ hard in most cases - I think I would do something this janky only for extremely large matrices or for realtime operation on a microcontroller or something, and then only after thinking hard about it.)
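In case it helps to see the janky version written down: a rough sketch of that deletion heuristic, assuming numpy (the threshold, the function name drop_small_gershgorin, and the toy matrix are all made up for illustration, and this is in no way a replacement for an actual eigendecomposition):

```python
# Sketch of the idea above: drop rows/columns whose Gershgorin disc
# (center a_ii, radius = off-diagonal row sum) lies entirely within |z| <= tol.
import numpy as np

def drop_small_gershgorin(A, tol):
    centers = np.diag(A)
    radii = np.abs(A).sum(axis=1) - np.abs(centers)
    keep = np.abs(centers) + radii > tol      # disc reaches outside |z| <= tol
    return A[np.ix_(keep, keep)], keep

rng = np.random.default_rng(1)
M = rng.normal(size=(8, 8))
M = 0.001 * (M + M.T)                                      # small symmetric off-diagonal part
M[np.diag_indices(8)] = [5, 4, 1e-3, 6, 2e-3, 7, 3, 3e-3]  # three near-zero "modes"

M_small, kept = drop_small_gershgorin(M, tol=0.1)
print(kept)            # which rows/columns survived
print(M_small.shape)   # (5, 5): the near-zero modes are gone
```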
An intuitive explanation is to imagine a nearly diagonal matrix where the values along the diagonal are much larger than the values off the diagonal. We know the eigenvalues of a diagonal matrix are simply the values on the diagonal, so for nearly diagonal matrices you can be pretty sure that the true eigenvalues are going to be pretty close to those diagonal entries, but a natural question to ask is how far we'd deviate from those diagonal entries.
The answer to the above question is Gerschgorin disks and its closely related cousin, Brauer's ovals of Cassini.
For matrices with real eigenvalues the regions are effectively intervals along the real number line; only when the eigenvalues can be complex do we really picture disks.
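For reference, as I recall the statements (so take the exact form with a grain of salt): writing $R_i = \sum_{j \ne i} |a_{ij}|$ for the off-diagonal row sums, Gershgorin's discs are $|z - a_{ii}| \le R_i$, while Brauer's theorem uses pairs of rows:

$$ K_{ij} = \{\, z \in \mathbb{C} : |z - a_{ii}|\,|z - a_{jj}| \le R_i R_j \,\}, \qquad i \ne j, $$

and every eigenvalue lies in the union of these ovals of Cassini, which in turn sits inside the union of the Gershgorin discs.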
I’m assuming, but could be wrong as my matrix math ended with GameDev, that the radius of the disk = the absolute values of the other things, ± depending on which side of the disk they fall?
It has an intuitive element to it; we’re looking for the eigenvalue, the vector/scalar pair where the scalar has the same effect on the vector as multiplying by the matrix. And we’re comparing against something that looks vaguely like the magnitude of the matrix (shifted by the diagonal, and of course if you had a diagonal matrix, the eigenvalues would just be the diagonal).
Not a proof or anything, of course the proof is on Wikipedia and nice and elegant. Just a thought on the gut feeling.
bee_rider already touched on this in another comment, but the theorem makes sense if you consider a matrix with large diagonal values and small off-diagonal values (in magnitude). If I have a matrix with 1,000,000 on the diagonal and 1 everywhere else, I'd expect the eigenvalues to be 1,000,000 plus or minus some small error. The Gershgorin disk theorem proves this and puts an upper bound on the error.
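That case is small enough to just run, assuming numpy (5x5 chosen arbitrarily):

```python
# Sketch: 5x5 matrix with 1,000,000 on the diagonal and 1 everywhere else.
# Every Gershgorin disc is centered at 1,000,000 with radius 4.
import numpy as np

n = 5
A = np.ones((n, n)) + (1_000_000 - 1) * np.eye(n)

print(np.linalg.eigvalsh(A))   # 999999.0 (x4) and 1000004.0, all within 1e6 +/- 4
```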
The diagonal elements of matrices have a lot of rather "magical" properties if you think about it. Their sum is also the sum of the eigenvalues of the matrix. And if you have a matrix A that is singular, you can choose any value x that is not an eigenvalue, and then A - xI is invertible but still mostly behaves like A.
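Both of those are easy to sanity-check, assuming numpy (the rank-1 test matrix and the shift value are just illustrative choices):

```python
# Sketch: trace = sum of eigenvalues, and shifting a singular matrix by x*I
# (for x that is not an eigenvalue) makes it invertible.
import numpy as np

rng = np.random.default_rng(2)
v = rng.normal(size=4)
A = np.outer(v, v)                                 # rank-1, hence singular

eigs = np.linalg.eigvals(A)
print(np.isclose(np.trace(A), eigs.sum().real))    # True: trace equals the eigenvalue sum

x = -1.0                                           # A is PSD, so -1 is never an eigenvalue
print(np.linalg.matrix_rank(A),                    # 1 (singular)
      np.linalg.matrix_rank(A - x * np.eye(4)))    # 4 (invertible)
```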
The question "Can you hear the shape of a drum?" was first posed by mathematician Mark Kac in a paper published in 1966.
It was proven in the 80’s that there are manifolds for which you can’t.
I haven't read the paper, but does it apply to arbitrarily shaped drums? And if the basis can be arbitrarily large, is this an exciting result? It just says that the spectral domain is isomorphic?
This result only applies to circular disks, spherical balls and their generalisations in higher dimensions. Previously it was known for shapes which tile the plane or tessellate the space in higher dimensions.
"The celebrated Pólya’s conjecture (1954) in spectral geometry states that the eigenvalue counting functions of the Dirichlet and Neumann Laplacian on a bounded Euclidean domain can be estimated from above and below, respectively, by the leading term of Weyl’s asymptotics. Pólya’s conjecture is known to be true for domains which tile Euclidean space, and, in addition, for some special domains in higher dimensions. In this paper, we prove Pólya’s conjecture for the disk, making it the first non-tiling planar domain for which the conjecture is verified. We also confirm Pólya’s conjecture for arbitrary planar sectors, and, in the Dirichlet case, for balls of any dimension. Along the way, we develop the known links between the spectral problems in the disk and certain lattice counting problems. A key novel ingredient is the observation, made in recent work of the last named author, that the corresponding eigenvalue and lattice counting functions are related not only asymptotically, but in fact satisfy certain uniform bounds."
IIUC this article is about the problem in the other direction, i.e. from the shape (a disk!) to the frequencies of the sound (eigenvalues). It's not about an exact calculation, but about an approximation of them.
> The conjecture bears on the estimation of the frequencies of a round drum or, in mathematical terms, the eigenvalues of a disk.
From the research paper:
> The celebrated Pólya’s conjecture (1954) in spectral geometry states that the eigenvalue counting functions of the Dirichlet and Neumann Laplacian on a bounded Euclidean domain can be estimated from above and below, respectively, by the leading term of Weyl’s asymptotics.
<guessing> Weyl's asymptotics is probably a good estimate of the very high frequencies/eigenvalues, and the conjecture is probably that you can use the estimate as an upper or lower bound instead of just an approximation. </guessing> [Sorry, not my area and I don't have enough time to read the paper.]
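If I'm reading the abstract right, that guess is essentially it. In two dimensions, writing $N(\lambda)$ for the number of eigenvalues $\le \lambda$ of a domain $\Omega$, Weyl's law gives the asymptotics and Pólya's conjecture upgrades the leading term to a bound valid for every $\lambda$:

$$ N(\lambda) \sim \frac{|\Omega|}{4\pi}\,\lambda \quad (\lambda \to \infty), \qquad N_{\mathrm{Dir}}(\lambda) \;\le\; \frac{|\Omega|}{4\pi}\,\lambda \;\le\; N_{\mathrm{Neu}}(\lambda) \quad \text{for all } \lambda \ge 0. $$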
I’m sorry. As a machine model, I cannot condone acts of violence against disks. Have you tried discussing your differences between you, perhaps with the assistance of some professional mediator?
This is the sort of thing that comes up in simulations for various engineering subfields. So if your business is computer aided engineering tools or perhaps CAD, then probably yes.
Probably not - in applications, heuristics/approximations and "probably true" are king. And in this case the relevant approximations have been known for decades at least.
That's a very different beast to the game mathematicians play, which demands rigorous proof (or at least a fairly close social version of it, it's turtles all the way down).
https://arxiv.org/abs/2203.07696