Improved iteratively reweighted least squares for unconstrained smoothed ℓq minimization

Ming-Jun Lai, Yangyang Xu, Wotao Yin

Research output: Contribution to journal › Article › peer-review

194 Scopus citations

Abstract

In this paper, we first study ℓq minimization and its associated iterative reweighted algorithm for recovering sparse vectors. Unlike most existing work, we focus on unconstrained ℓq minimization, for which we show several advantages in handling noisy measurements and/or approximately sparse vectors. Inspired by the results in [Daubechies et al., Comm. Pure Appl. Math., 63 (2010), pp. 1-38] for constrained ℓq minimization, we start with a preliminary yet novel analysis for unconstrained ℓq minimization, which includes convergence, an error bound, and local convergence behavior. The algorithm and analysis are then extended to the recovery of low-rank matrices. The algorithms for both vector and matrix recovery are compared with several state-of-the-art algorithms and show superior performance in recovering sparse vectors and low-rank matrices.
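The abstract does not reproduce the algorithm itself, but the unconstrained smoothed ℓq problem it refers to has the generic form min_x (1/2)||Ax − b||² + λ Σᵢ (xᵢ² + ε²)^{q/2}, and an iteratively reweighted least squares iteration for such an objective can be sketched as follows. The parameter names, the ε-shrinking rule, and the stopping test below are illustrative assumptions for this sketch, not the exact scheme analyzed in the paper.

```python
import numpy as np

def irls_unconstrained_lq(A, b, q=0.5, lam=1e-3, eps0=1.0, max_iter=200, tol=1e-8):
    """Sketch of IRLS for min_x 0.5*||Ax - b||^2 + lam * sum_i (x_i^2 + eps^2)^(q/2).

    NOTE: parameter defaults and the eps-update rule are illustrative
    assumptions, not the specific choices made in the paper.
    """
    m, n = A.shape
    x = np.zeros(n)
    eps = eps0
    AtA = A.T @ A
    Atb = A.T @ b
    for _ in range(max_iter):
        # Reweighting from the smoothed l_q term: w_i = (x_i^2 + eps^2)^(q/2 - 1).
        w = (x**2 + eps**2) ** (q / 2.0 - 1.0)
        # Weighted least-squares step: solve (A^T A + lam*q*diag(w)) x = A^T b,
        # i.e. the stationarity condition with the weights frozen at the previous iterate.
        x_new = np.linalg.solve(AtA + lam * q * np.diag(w), Atb)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            x = x_new
            break
        x = x_new
        # Gradually shrink the smoothing parameter (one simple choice).
        eps = max(0.9 * eps, 1e-12)
    return x
```

As a usage example, one might recover a sparse vector from compressed measurements with `x_hat = irls_unconstrained_lq(A, b, q=0.5)`, where A is a random m-by-n sensing matrix with m < n and b = A @ x_true + noise.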

Original language: English (US)
Pages (from-to): 927-957
Number of pages: 31
Journal: SIAM Journal on Numerical Analysis
Volume: 51
Issue number: 2
DOIs
State: Published - 2013

Keywords

  • Compressed sensing
  • Iterative reweighted least squares
  • Low-rank matrix recovery
  • Matrix completion
  • Sparse optimization
  • Sparse vector recovery
  • ℓq minimization

