New operators for fixed-point theory: The sparsity-aware learning case

Konstantinos Slavakis, Yannis Kopsinis, Sergios Theodoridis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The present paper offers a link between fixed point theory and thresholding, one of the key enablers in sparsity-promoting algorithms, associated mostly with non-convex penalty functions. A novel family of operators, the partially quasi-nonexpansive mappings, is introduced to provide the necessary theoretical foundations. On this fixed point theoretic ground, and motivated by hard thresholding, the generalized thresholding (GT) mapping is proposed; it encompasses hard and soft thresholding, as well as recent advances in thresholding rules. GT is incorporated into an online/time-adaptive algorithm of linear complexity that demonstrates competitive performance with respect to computationally more demanding, state-of-the-art, RLS- and proportionate-type sparsity-aware methods.
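The GT mapping itself is defined in the paper; as a minimal illustration of the hard and soft thresholding rules it generalizes, consider the following sketch (function names and the threshold value are illustrative, not from the paper):

```python
def hard_threshold(x, lam):
    """Hard thresholding: keep entries with magnitude above lam, zero the rest."""
    return [xi if abs(xi) > lam else 0.0 for xi in x]

def soft_threshold(x, lam):
    """Soft thresholding: shrink every entry's magnitude toward zero by lam."""
    return [(1 if xi > 0 else -1) * max(abs(xi) - lam, 0.0) for xi in x]

x = [0.2, -1.5, 0.9, -0.1]
print(hard_threshold(x, 0.5))  # small entries zeroed, large ones kept intact
print(soft_threshold(x, 0.5))  # all magnitudes reduced by 0.5, small ones zeroed
```

Hard thresholding leaves surviving coefficients unchanged (a non-convex penalty rule), while soft thresholding biases them toward zero (the convex, l1-penalty rule); GT is constructed to cover both extremes within a single fixed point framework.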

Original language: English (US)
Title of host publication: 2013 Proceedings of the 21st European Signal Processing Conference, EUSIPCO 2013
Publisher: European Signal Processing Conference, EUSIPCO
ISBN (Print): 9780992862602
State: Published - Jan 1 2013
Event: 2013 21st European Signal Processing Conference, EUSIPCO 2013 - Marrakech, Morocco
Duration: Sep 9 2013 - Sep 13 2013

Publication series

Name: European Signal Processing Conference
ISSN (Print): 2219-5491

Other

Other: 2013 21st European Signal Processing Conference, EUSIPCO 2013
Country: Morocco
City: Marrakech
Period: 9/9/13 - 9/13/13

Keywords

  • Thresholding
  • adaptive filtering
  • fixed point theory
  • sparsity

