A convex formulation for high-dimensional sparse sliced inverse regression

Kean Ming Tan, Zhaoran Wang, Tong Zhang, Han Liu, R. Dennis Cook

Research output: Contribution to journal › Article › peer-review


Abstract

Sliced inverse regression is a popular tool for sufficient dimension reduction, which replaces the covariates with a minimal set of their linear combinations without loss of information on the conditional distribution of the response given the covariates. The estimated linear combinations include all covariates, making the results difficult to interpret and perhaps unnecessarily variable, particularly when the number of covariates is large. In this paper, we propose a convex formulation for fitting sparse sliced inverse regression in high dimensions. Our proposal estimates the subspace of the linear combinations of the covariates directly and performs variable selection simultaneously. We solve the resulting convex optimization problem via the linearized alternating direction method of multipliers algorithm, and establish an upper bound on the subspace distance between the estimated and the true subspaces. Through numerical studies, we show that our proposal is able to identify the correct covariates in the high-dimensional setting.
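As background for the slicing step the abstract builds on, the following is a minimal NumPy sketch of the classical (non-sparse) sliced inverse regression estimator. This is an illustration of the standard procedure only, not the paper's convex formulation or its linearized ADMM solver; the function name, the quantile-based slicing, and the eigendecomposition-based whitening are all choices made for this sketch.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Sketch of classical sliced inverse regression.

    Whitens X, slices the observations by the ordered response,
    and eigendecomposes the between-slice covariance of the slice
    means to recover candidate dimension-reduction directions.
    """
    n, p = X.shape
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    # Whiten: Z = (X - mu) @ Sigma^{-1/2}, via the eigendecomposition of Sigma.
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ inv_sqrt
    # Slice by quantiles of the response.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the within-slice means of the whitened data.
    M = np.zeros((p, p))
    for idx in slices:
        zbar = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(zbar, zbar)
    # Top eigenvectors of M, mapped back to the original coordinates.
    w, v = np.linalg.eigh(M)
    dirs = inv_sqrt @ v[:, ::-1][:, :n_dirs]
    return dirs / np.linalg.norm(dirs, axis=0)
```

Because every coordinate of the estimated directions is generically nonzero, this classical estimator exhibits exactly the interpretability problem the abstract describes; the paper's contribution is a convex, sparsity-inducing reformulation of this subspace estimation step.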

Original language: English (US)
Pages (from-to): 769-782
Number of pages: 14
Journal: Biometrika
Volume: 105
Issue number: 4
DOIs
State: Published - Dec 1 2018

Bibliographical note

Funding Information:
This work was partially supported by the National Science Foundation. We thank the editor, an associate editor, and three reviewers for their comments. We thank Lexin Li and Tao Wang for responding to our inquiries and providing the R code.

Publisher Copyright:
© 2018 Biometrika Trust.

Keywords

  • Convex optimization
  • Dimension reduction
  • Nonparametric regression
  • Principal fitted component

