Most model selection procedures use a fixed penalty penalizing an increase in the size of a model. These nonadaptive selection procedures perform well only in one type of situation. For instance, Bayesian information criterion (BIC) with a large penalty performs well for "small" models and poorly for "large" models, and Akaike's information criterion (AIC) does just the opposite. This article proposes an adaptive model selection procedure that uses a data-adaptive complexity penalty based on a concept of generalized degrees of freedom. The proposed procedure, combining the benefits of a class of nonadaptive procedures, approximates the best performance of this class of procedures across a variety of different situations. This class includes many well-known procedures, such as AIC, BIC, Mallows's Cp, and risk inflation criterion (RIC). The proposed procedure is applied to wavelet thresholding in nonparametric regression and variable selection in least squares regression. Simulation results and an asymptotic analysis support the effectiveness of the proposed procedure.
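To illustrate the fixed-penalty criteria the abstract contrasts, the following is a minimal sketch of penalized least squares model selection over nested models, assuming Gaussian errors so the criterion takes the familiar form n·log(RSS/n) + λ·k, with λ = 2 for AIC and λ = log(n) for BIC. The function names and the simulated data are illustrative, not part of the article.

```python
import numpy as np

def selection_criterion(rss, n, k, penalty):
    # Penalized goodness-of-fit for Gaussian least squares:
    # n * log(RSS / n) + penalty * (number of parameters k).
    return n * np.log(rss / n) + penalty * k

def select_model_size(y, X_full, penalty):
    # Fit the nested least-squares models X[:, :k], k = 1..p,
    # and return the size k minimizing the penalized criterion.
    n, p = X_full.shape
    best_k, best_score = None, np.inf
    for k in range(1, p + 1):
        X = X_full[:, :k]
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = float(np.sum((y - X @ beta) ** 2))
        score = selection_criterion(rss, n, k, penalty)
        if score < best_score:
            best_k, best_score = k, score
    return best_k

rng = np.random.default_rng(0)
n, p = 200, 8
X = np.column_stack([np.ones(n)] + [rng.normal(size=n) for _ in range(p - 1)])
# True model uses only the first 3 columns (a "small" model).
y = X[:, :3] @ np.array([1.0, 2.0, -1.5]) + rng.normal(size=n)

k_aic = select_model_size(y, X, penalty=2.0)        # AIC: lambda = 2
k_bic = select_model_size(y, X, penalty=np.log(n))  # BIC: lambda = log(n)
```

Because AIC's penalty is smaller than BIC's, AIC never selects a smaller model than BIC on the same fit path; here, with a small true model, BIC's heavier penalty is the better match, which is the situation-dependence the adaptive procedure is designed to overcome.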
Bibliographical note / Funding information:
Xiaotong Shen is Associate Professor, Department of Statistics, The Ohio State University, Columbus, OH 43210 (E-mail: firstname.lastname@example.org). Jianming Ye is Associate Professor, Stan Ross Department of Accountancy, Baruch College, City University of New York, NY 10010 (E-mail: email@example.com). Shen's research was supported in part by National Science Foundation grant DMS-0072635 and Office of Naval Research grant N00014-99-1-0214. Ye gratefully acknowledges the financial support of the Zicklin School of Business, Baruch College. The authors also thank the associate editor and anonymous referees for very helpful comments and suggestions.
- Adaptive penalty
- False discovery rate
- Optimal prediction
- Parametric and nonparametric regression
- Variable selection