On optimal low rank Tucker approximation for tensors: the case for an adjustable core size

Bilian Chen, Zhening Li, Shuzhong Zhang

Research output: Contribution to journal › Article › peer-review



Approximating high-order tensors by low Tucker-rank tensors has applications in psychometrics, chemometrics, computer vision, and biomedical informatics, among others. Traditionally, solution methods for finding a low Tucker-rank approximation presume that the size of the core tensor is specified in advance, which may not be a realistic assumption in many applications. In this paper we propose a new computational model in which the configuration and the size of the core become part of the decisions to be optimized. Our approach is based on the so-called maximum block improvement method for non-convex block optimization. Numerical tests on various real data sets from gene expression analysis and image compression are reported, showing the promising performance of the proposed algorithms.
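For readers unfamiliar with the problem setting, the sketch below computes a low Tucker-rank approximation of a tensor via the standard truncated higher-order SVD (HOSVD) baseline using NumPy. This is not the paper's maximum block improvement (MBI) algorithm, and unlike the paper's model, the core size `ranks` is fixed in advance rather than optimized; the function names are illustrative only.

```python
import numpy as np

def truncated_hosvd(T, ranks):
    """Low Tucker-rank approximation of T via truncated HOSVD.

    Baseline method only (not the paper's MBI algorithm): for each
    mode, keep the leading singular vectors of the mode unfolding,
    then contract T with the factor transposes to form the core.
    """
    factors = []
    for mode, r in enumerate(ranks):
        # Unfold T along `mode` into a matrix and take its top-r
        # left singular vectors as the mode-`mode` factor.
        unfolding = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(U[:, :r])
    # Core = T contracted with U_k^T along every mode k.
    core = T
    for mode, U in enumerate(factors):
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

def reconstruct(core, factors):
    """Multiply the core by each factor to recover the approximation."""
    T = core
    for mode, U in enumerate(factors):
        T = np.moveaxis(
            np.tensordot(U, np.moveaxis(T, mode, 0), axes=1), 0, mode)
    return T

T = np.random.rand(6, 5, 4)
core, factors = truncated_hosvd(T, ranks=(3, 3, 3))
approx = reconstruct(core, factors)
```

When `ranks` equals the full tensor dimensions, the reconstruction is exact; smaller cores trade accuracy for compression, and choosing that trade-off automatically is precisely the question the paper's adjustable-core-size model addresses.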

Original language: English (US)
Pages (from-to): 811-832
Number of pages: 22
Journal: Journal of Global Optimization
Issue number: 4
State: Published - Aug 25 2015

Bibliographical note

Funding Information:
This work was partially supported by National Science Foundation of China (Grant 11301436 and 11371242), National Science Foundation of USA (Grant CMMI-1161242), Natural Science Foundation of Shanghai (Grant 12ZR1410100), and Ph.D. Programs Foundation of Chinese Ministry of Education (Grant 20123108120002). We would like to thank the anonymous referee for the insightful suggestions.

Publisher Copyright:
© 2014, Springer Science+Business Media New York.


Keywords

  • Low-rank approximation
  • Maximum block improvement
  • Multiway array
  • Tucker decomposition


