Model selection procedure for high-dimensional data

Yongli Zhang, Xiaotong Shen

Research output: Contribution to journal › Article › peer-review



In high-dimensional regression, the number of predictors may greatly exceed the sample size, yet only a small fraction of them are related to the response. Variable selection is therefore essential, and consistent model selection is the primary concern. However, conventional consistent model selection criteria such as the Bayesian information criterion (BIC) may be inadequate because they do not adapt to the model space and because exhaustive search is infeasible. To address these two issues, we establish a lower bound on the probability that an information criterion selects the smallest true model, and based on this bound we propose a model selection criterion, termed RICc, that adapts to the model space. Furthermore, we develop a computationally feasible method that combines the computational power of least angle regression (LAR) with that of RICc. Both theoretical and simulation studies show that this method identifies the smallest true model with probability converging to one, provided the smallest true model lies on the LAR path. The proposed method is applied to real data from the power market, where it outperforms backward variable selection in price forecasting accuracy.
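The LAR-plus-criterion strategy described in the abstract can be sketched in a few lines: LAR produces a nested sequence of candidate models, and an information criterion scores each candidate, avoiding exhaustive search over all 2^p subsets. The paper's exact RICc penalty is not reproduced here; the sketch below substitutes a classical RIC-style penalty (2k log p) as an illustrative stand-in, on synthetic data with an assumed sparse signal.

```python
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(0)
n, p = 100, 200          # "large p, small n" setting
k_true = 3               # only a few predictors matter
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:k_true] = 3.0      # assumed strong sparse signal
y = X @ beta + rng.standard_normal(n)

# LAR yields a nested path of candidate supports (one per step),
# so only ~min(n, p) models need scoring, not all 2^p subsets.
alphas, _, coefs = lars_path(X, y, method="lar", max_iter=20)

best_score, best_support = np.inf, np.array([], dtype=int)
for j in range(coefs.shape[1]):
    support = np.flatnonzero(coefs[:, j])
    k = support.size
    if k == 0:
        rss = float(y @ y)
    else:
        # least-squares refit restricted to the selected variables
        bhat, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        resid = y - X[:, support] @ bhat
        rss = float(resid @ resid)
    # RIC-style score: n*log(RSS/n) + 2*k*log(p).  The paper's RICc
    # penalty differs (it adapts to the model space); this classical
    # RIC penalty is a stand-in for illustration only.
    score = n * np.log(rss / n) + 2 * k * np.log(p)
    if score < best_score:
        best_score, best_support = score, support

print(sorted(best_support.tolist()))
```

With a strong signal, the first LAR steps pick up the true predictors and the log(p) penalty stops the path from over-selecting, mirroring the abstract's claim that the method recovers the smallest true model when LAR visits it.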

Original language: English (US)
Pages (from-to): 350-358
Number of pages: 9
Journal: Statistical Analysis and Data Mining
Issue number: 5
State: Published - Oct 1 2010


Keywords:
  • Information criterion
  • Large p but small n
  • Model selection
  • Power market
  • RIC

