ADMM for High-Dimensional Sparse Penalized Quantile Regression

Yuwen Gu, Jun Fan, Lingchen Kong, Shiqian Ma, Hui Zou

Research output: Contribution to journal › Article › peer-review


Abstract

Sparse penalized quantile regression is a useful tool for variable selection, robust estimation, and heteroscedasticity detection in high-dimensional data analysis. The computational issue of sparse penalized quantile regression has not yet been fully resolved in the literature, due to the nonsmoothness of the quantile regression loss function. We introduce fast alternating direction method of multipliers (ADMM) algorithms for computing sparse penalized quantile regression. The convergence properties of the proposed algorithms are established. Numerical examples demonstrate the competitive performance of our algorithms: they significantly outperform several other fast solvers for high-dimensional penalized quantile regression. Supplementary materials for this article are available online.
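To illustrate the kind of updates such an approach involves, below is a minimal Python/NumPy sketch of a linearized (proximal) ADMM for the lasso-penalized quantile regression problem, using the splitting z = y - Xβ. This is not the authors' implementation: the step sizes, stopping rule, and names (admm_qr_lasso, prox_check, soft_threshold) are illustrative assumptions, and a practical solver would add residual-based stopping criteria and penalty tuning.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding operator."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_check(v, tau, alpha):
    """Proximal operator of alpha * rho_tau, where rho_tau(u) = u * (tau - 1{u < 0})."""
    return np.where(v > tau * alpha, v - tau * alpha,
                    np.where(v < -(1.0 - tau) * alpha, v + (1.0 - tau) * alpha, 0.0))

def admm_qr_lasso(X, y, tau=0.5, lam=0.1, rho=1.0, n_iter=500):
    """Linearized ADMM sketch for
        min_beta (1/n) * sum_i rho_tau(y_i - x_i' beta) + lam * ||beta||_1,
    using the constraint X beta + z = y (i.e., z = y - X beta)."""
    n, p = X.shape
    beta = np.zeros(p)
    z = y.copy()
    w = np.zeros(n)                          # scaled dual variable
    eta = rho * np.linalg.norm(X, 2) ** 2    # majorization constant >= rho * lambda_max(X'X)
    for _ in range(n_iter):
        # beta-update: one proximal-gradient (linearized) step on the lasso subproblem
        grad = rho * X.T @ (X @ beta + z - y + w)
        beta = soft_threshold(beta - grad / eta, lam / eta)
        # z-update: closed-form proximal operator of the check loss, elementwise
        z = prox_check(y - X @ beta - w, tau, 1.0 / (n * rho))
        # dual update on the scaled multiplier
        w = w + X @ beta + z - y
    return beta
```

The linearized beta-update avoids solving an inner lasso problem exactly, and the z-update exploits the closed-form proximal mapping of the nonsmooth check loss, which is what makes an ADMM splitting attractive for this problem.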

Original language: English (US)
Pages (from-to): 319-331
Number of pages: 13
Journal: Technometrics
Volume: 60
Issue number: 3
State: Published - Jul 3, 2018

Bibliographical note

Funding Information:
This work is supported in part by NSF grant DMS-1505111, the 111 Project of China (B16002), National Science Foundation of China grants 11431002 and 11671029, and the Hong Kong Research Grants Council General Research Fund (14205314).

Publisher Copyright:
© 2018 American Statistical Association and the American Society for Quality. © 2018 Yuwen Gu and Jun Fan.

Keywords

  • Alternating direction method of multipliers
  • Lasso
  • Nonconvex penalty
  • Quantile regression
  • Variable selection
