Distributed sparse linear regression

Gonzalo Mateos, Juan Andrés Bazerque, Georgios B Giannakis

Research output: Contribution to journal › Article › peer-review

396 Scopus citations

Abstract

The Lasso is a popular technique for joint estimation and continuous variable selection, especially well-suited for sparse and possibly under-determined linear regression problems. This paper develops algorithms to estimate the regression coefficients via Lasso when the training data are distributed across different agents, and their communication to a central processing unit is prohibited due to, e.g., communication cost or privacy reasons. A motivating application is explored in the context of wireless communications, whereby sensing cognitive radios collaborate to estimate the radio-frequency power spectrum density. Attaining different tradeoffs between complexity and convergence speed, three novel algorithms are obtained after reformulating the Lasso into a separable form, which is iteratively minimized using the alternating-direction method of multipliers so as to gain the desired degree of parallelization. Interestingly, the per-agent estimate updates are given by simple soft-thresholding operations, and inter-agent communication overhead remains at an affordable level. Without exchanging elements from the different training sets, the local estimates consent to the global Lasso solution, i.e., the fit that would be obtained if the entire data set were centrally available. Numerical experiments with both simulated and real data demonstrate the merits of the proposed distributed schemes, corroborating their convergence and global optimality. The ideas in this paper can be easily extended for the purpose of fitting related models in a distributed fashion, including the adaptive Lasso, elastic net, fused Lasso and nonnegative garrote.
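To illustrate the algorithmic core the abstract describes (ADMM applied to a split Lasso objective, with soft-thresholding updates), the following is a minimal centralized sketch, not the paper's distributed multi-agent algorithm. The function names, the choice of penalty parameter `rho`, and the fixed iteration count are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: sign(v) * max(|v| - t, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=200):
    """Sketch of ADMM for the Lasso (centralized, for illustration):
        minimize 0.5*||A x - b||^2 + lam*||x||_1
    using the split x = z. The z-update is exactly a soft-thresholding
    step, mirroring the per-agent updates described in the abstract."""
    n = A.shape[1]
    Atb = A.T @ b
    # Factor (A^T A + rho I) once; it is reused every iteration.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    for _ in range(n_iter):
        # x-update: ridge-type least-squares solve via the Cholesky factor
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: proximal operator of lam*||.||_1 (soft-thresholding)
        z = soft_threshold(x + u, lam / rho)
        # dual update
        u = u + x - z
    return z
```

In the distributed setting of the paper, each agent would hold only its own block of `(A, b)` and replace the global x-update with a local one plus neighbor exchanges; the soft-thresholding structure of the z-update is what keeps the per-agent computation cheap.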

Original language: English (US)
Article number: 5499155
Pages (from-to): 5262-5276
Number of pages: 15
Journal: IEEE Transactions on Signal Processing
Volume: 58
Issue number: 10
DOIs
State: Published - Oct 2010

Bibliographical note

Funding Information:
Manuscript received January 29, 2010; accepted June 20, 2010. Date of publication July 01, 2010; date of current version September 15, 2010. The associate editor coordinating the review of this manuscript and approving it for publication was Prof. Huaiyu Dai. Work in this paper was supported by the NSF Grants CCF-0830480 and ECCS-0824007. Part of the paper was presented at the International Conference on Acoustics, Speech and Signal Processing, Dallas, TX, March 15-19, 2010.

Keywords

  • Distributed linear regression
  • Lasso
  • parallel optimization
  • sparse estimation
