Tensor Decomposition for Signal Processing and Machine Learning

Nicholas D. Sidiropoulos, Lieven De Lathauwer, Xiao Fu, Kejun Huang, Evangelos E. Papalexakis, Christos Faloutsos

Research output: Contribution to journal › Article › peer-review

996 Scopus citations

Abstract

Tensors or multiway arrays are functions of three or more indices (i, j, k, ...), similar to matrices (two-way arrays), which are functions of two indices (r, c) for (row, column). Tensors have a rich history, stretching over almost a century and touching upon numerous disciplines, but they have only recently become ubiquitous in signal and data analytics at the confluence of signal processing, statistics, data mining, and machine learning. This overview article aims to provide a good starting point for researchers and practitioners interested in learning about and working with tensors. As such, it focuses on fundamentals and motivation (using various application examples), aiming to strike an appropriate balance of breadth and depth that will enable someone having taken first graduate courses in matrix algebra and probability to get started doing research and/or developing tensor algorithms and software. Some background in applied optimization is useful but not strictly required. The material covered includes tensor rank and rank decomposition; basic tensor factorization models and their relationships and properties (including fairly good coverage of identifiability); broad coverage of algorithms ranging from alternating optimization to stochastic gradient; statistical performance analysis; and applications ranging from source separation to collaborative filtering, mixture and topic modeling, classification, and multilinear subspace learning.
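The abstract mentions the canonical polyadic decomposition (CPD) and alternating-optimization algorithms. As a rough illustration of that combination, the sketch below fits a rank-R CPD of a 3-way tensor with plain alternating least squares (ALS) in NumPy. All function names, shapes, and the stopping rule here are illustrative assumptions, not taken from the article itself.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: bring `mode` to the front, flatten the rest (C order)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product of A (I x R) and B (J x R) -> (I*J) x R."""
    I, R = A.shape
    J, _ = B.shape
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, R)

def cpd_als(T, R, n_iter=200, seed=0):
    """Rank-R CPD of a 3-way tensor via alternating least squares (a sketch).

    Each step fixes two factor matrices and solves a linear least-squares
    problem for the third, using the matching mode unfolding.
    """
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, R))
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))
    for _ in range(n_iter):
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C).T)
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C).T)
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C

# Build an exact rank-2 tensor from random factors, then try to recover it.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((d, 2)) for d in (4, 5, 6))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cpd_als(T, R=2)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
rel_err = np.linalg.norm(T - T_hat) / np.linalg.norm(T)
print(rel_err)
```

On a small noiseless problem like this, ALS typically drives the relative reconstruction error close to zero, although convergence from a random start is not guaranteed in general (the article's algorithms section covers this and faster alternatives such as Gauss-Newton and stochastic gradient methods).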

Original language: English (US)
Article number: 7891546
Pages (from-to): 3551-3582
Number of pages: 32
Journal: IEEE Transactions on Signal Processing
Volume: 65
Issue number: 13
DOIs
State: Published - Jul 1 2017

Bibliographical note

Publisher Copyright:
© 1991-2012 IEEE.

Keywords

  • Cramér-Rao bound
  • Gauss-Newton
  • NP-hard problems
  • Tensor decomposition
  • Tucker model
  • alternating direction method of multipliers
  • alternating optimization
  • canonical polyadic decomposition (CPD)
  • classification
  • collaborative filtering
  • communications
  • gradient descent
  • harmonic retrieval
  • higher-order singular value decomposition (HOSVD)
  • mixture modeling
  • multilinear singular value decomposition (MLSVD)
  • parallel factor analysis (PARAFAC)
  • rank
  • source separation
  • speech separation
  • stochastic gradient
  • subspace learning
  • tensor factorization
  • topic modeling
  • uniqueness
