On the tensor SVD and the optimal low rank orthogonal approximation of tensors

Jie Chen, Yousef Saad

Research output: Contribution to journal › Article › peer-review

52 Scopus citations

Abstract

It is known that a higher order tensor does not necessarily have an optimal low rank approximation, and that a tensor might not be orthogonally decomposable (i.e., admit a tensor SVD). We provide several sufficient conditions for the failure of the tensor SVD, and characterize its existence in terms of the higher order SVD (HOSVD). Given these difficulties in generalizing standard matrix results to tensors, we consider the low rank orthogonal approximation of tensors. The existence of an optimal approximation is theoretically guaranteed under certain conditions, and this optimal approximation yields a tensor decomposition in which the diagonal of the core is maximized. We present an algorithm to compute this approximation and analyze its convergence behavior. Numerical experiments indicate a linear convergence rate for this algorithm.
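The HOSVD mentioned in the abstract can be sketched in a few lines: take the left singular vectors of each mode unfolding as the orthogonal factor matrices, then multiply them back to obtain the all-orthogonal core. This is a minimal sketch of the standard HOSVD construction (De Lathauwer et al.), not the paper's low rank orthogonal approximation algorithm; the function and variable names are illustrative.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: mode-`mode` fibers become the columns of a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T):
    """Higher order SVD: one orthogonal factor U_n per mode, plus a core S
    with T = S x_1 U_1 x_2 U_2 ... (multilinear product)."""
    factors = [np.linalg.svd(unfold(T, n), full_matrices=False)[0]
               for n in range(T.ndim)]
    S = T
    for n, U in enumerate(factors):
        # Mode-n product with U_n^T: contract mode n of S against U_n
        S = np.moveaxis(np.tensordot(U.T, S, axes=(1, n)), 0, n)
    return S, factors

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 4, 5))
S, Us = hosvd(T)

# Reconstruct T = S x_1 U_1 x_2 U_2 x_3 U_3 to check the decomposition
R = S
for n, U in enumerate(Us):
    R = np.moveaxis(np.tensordot(U, R, axes=(1, n)), 0, n)
assert np.allclose(R, T)
```

Truncating the factor matrices here gives the familiar HOSVD-based low rank approximation; the point of the paper is that, unlike in the matrix case, this truncation is generally not optimal, motivating the dedicated algorithm analyzed in the article.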

Original language: English (US)
Pages (from-to): 1709-1734
Number of pages: 26
Journal: SIAM Journal on Matrix Analysis and Applications
Volume: 30
Issue number: 4
DOIs
State: Published - 2008

Keywords

  • Low rank approximation
  • Multilinear algebra
  • Singular value decomposition
  • Tensor decomposition
