
A multilinear singular value decomposition (2000) [pdf] - cinquemb
http://citeseer.ist.psu.edu/viewdoc/download;jsessionid=1CADD9B520DD6B41383552BB7CAB0F2F?doi=10.1.1.102.9135&rep=rep1&type=pdf
======
ZenoArrow
I can't download the paper. Can anyone provide a quick summary?

~~~
andreyf
Abstract: We discuss a multilinear generalization of the singular value
decomposition. There is a strong analogy between several properties of the
matrix and the higher-order tensor decomposition; uniqueness, link with the
matrix eigenvalue decomposition, first-order perturbation effects, etc., are
analyzed. We investigate how tensor symmetries affect the decomposition and
propose a multilinear generalization of the symmetric eigenvalue decomposition
for pair-wise symmetric tensors.

~~~
ZenoArrow
Thanks andreyf.

I'm not familiar with most of those terms, but I'm guessing it's a big deal if
it's got a decent number of upvotes. Do these findings have any benefits that
a layperson might be able to understand (you can assume some programming
knowledge if that's relevant)?

~~~
lsorber
Attempt at a short summary:

A matrix M has an SVD U S V', where U and V have orthonormal columns and S is
a diagonal matrix. How does this generalize to multidimensional arrays?

A third-order tensor T has a multilinear SVD of the form S x (U, V, W), where
S is a core tensor and U, V, and W again have orthonormal columns. In the
matrix SVD, U transforms the columns of S, while V transforms its rows (hence
it appears on the right of S). The product S x (U, V, W) is similar: U
transforms the columns of the core tensor S, V its rows, and W the mode-3
vectors of S.
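A minimal numpy sketch of this construction, under the usual recipe (not spelled out above): each factor matrix is taken as the left singular vectors of one "unfolding" of T, and the core is obtained by inverting the factors out of T. Variable names and the einsum formulation are my own.

```python
import numpy as np

def unfold(T, mode):
    # Matricize T: put `mode` first, flatten the remaining axes.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T):
    # Factor matrices: left singular vectors of each mode's unfolding.
    U, V, W = (np.linalg.svd(unfold(T, m), full_matrices=False)[0]
               for m in range(3))
    # Core tensor: S = T x (U', V', W').
    S = np.einsum('ijk,ia,jb,kc->abc', T, U, V, W)
    return S, U, V, W

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 4, 5))
S, U, V, W = hosvd(T)
# The product S x (U, V, W) reconstructs T exactly.
T_rec = np.einsum('abc,ia,jb,kc->ijk', S, U, V, W)
print(np.allclose(T, T_rec))  # True
```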

The authors show that the slices of S along each mode must be mutually
orthogonal under the Frobenius inner product, e.g., <S(:,i,:), S(:,j,:)> =
c_i * delta(i,j), where the ordered c_i play the role of the squared singular
values. In the case of a matrix SVD, this property reduces to S being
diagonal.
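That all-orthogonality property can be checked numerically. A sketch in numpy (my own setup: factors from the unfoldings' left singular vectors, then the Gram matrix of the mode-2 slices of the core, which should come out diagonal):

```python
import numpy as np

def unfold(T, mode):
    # Matricize T: put `mode` first, flatten the remaining axes.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 4, 5))
# Factors are the left singular vectors of the unfoldings;
# the core is T with the factors inverted out.
U, V, W = (np.linalg.svd(unfold(T, m), full_matrices=False)[0]
           for m in range(3))
S = np.einsum('ijk,ia,jb,kc->abc', T, U, V, W)
# Gram matrix of the mode-2 slices: G[i, j] = <S(:,i,:), S(:,j,:)>.
G = np.einsum('aic,ajc->ij', S, S)
print(np.allclose(G, np.diag(np.diag(G))))  # off-diagonal entries vanish
```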

An example application is predicting how a user would rate a movie at a given
time by approximating (the known entries of) a user x movie x time tensor with
a truncated multilinear SVD.
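A toy sketch of the truncation idea in numpy, on a synthetic low-multilinear-rank tensor standing in for such data (the sizes, ranks, and axis labels below are all made up for illustration):

```python
import numpy as np

def unfold(T, mode):
    # Matricize T: put `mode` first, flatten the remaining axes.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

rng = np.random.default_rng(2)
# Synthetic tensor of exact multilinear rank (5, 5, 2), standing in for
# the low-rank structure a real ratings tensor is assumed to have.
core = rng.standard_normal((5, 5, 2))
A = rng.standard_normal((30, 5))   # "users"
B = rng.standard_normal((40, 5))   # "movies"
C = rng.standard_normal((10, 2))   # "times"
T = np.einsum('abc,ia,jb,kc->ijk', core, A, B, C)

# Truncated multilinear SVD: keep only the leading singular vectors of
# each unfolding, then form the small core and the approximation.
ranks = (5, 5, 2)
Ut = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
      for m, r in enumerate(ranks)]
S = np.einsum('ijk,ia,jb,kc->abc', T, *Ut)
T_hat = np.einsum('abc,ia,jb,kc->ijk', S, *Ut)
print(np.allclose(T, T_hat))  # exact here, since T really has this rank
```

On real data the truncated reconstruction is only an approximation, and its entries at unobserved positions serve as the predictions.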

