High-dimensional regression/classification is challenging due to the curse of dimensionality. Lasso and its various extensions, which can simultaneously perform feature selection and regression/classification, have received increasing attention in this setting. However, in the presence of highly correlated features, lasso tends to select only one of them, resulting in suboptimal performance. Several methods have been proposed in the literature to address this issue. Shen and Ye introduce an adaptive model selection procedure that corrects the estimation bias through a data-driven penalty based on generalized degrees of freedom.