We propose a new class of generalized thresholding operators that combine thresholding with shrinkage, and study generalized thresholding of the sample covariance matrix in high dimensions. Generalized thresholding of the covariance matrix has good theoretical properties and carries almost no computational burden. We obtain an explicit convergence rate in the operator norm that shows the tradeoff between the sparsity of the true model, the dimension, and the sample size, and shows that generalized thresholding is consistent over a large class of models as long as the dimension p and the sample size n satisfy log p/n → 0. In addition, we show that generalized thresholding has the "sparsistency" property, meaning it estimates true zeros as zeros with probability tending to 1, and, under an additional mild condition, is sign consistent for nonzero elements. We show that generalized thresholding covers, as special cases, hard and soft thresholding, smoothly clipped absolute deviation, and adaptive lasso, and compare different types of generalized thresholding in a simulation study and in an example of gene clustering from a microarray experiment with tumor tissues.
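To make the idea concrete, the following is a minimal sketch (not the authors' code) of one special case named in the abstract: soft thresholding applied entrywise to the off-diagonals of a sample covariance matrix, with the threshold set at the sqrt(log p / n) rate suggested by the consistency condition. The function name `soft_threshold_cov` and the toy data are illustrative assumptions.

```python
import numpy as np

def soft_threshold_cov(S, lam):
    """Soft-threshold the off-diagonal entries of a sample covariance
    matrix S: s_lam(x) = sign(x) * max(|x| - lam, 0).
    The diagonal (variances) is left untouched."""
    T = np.sign(S) * np.maximum(np.abs(S) - lam, 0.0)
    np.fill_diagonal(T, np.diag(S))  # restore original variances
    return T

# Toy high-dimensional-flavored example: p not small relative to n
rng = np.random.default_rng(0)
p, n = 30, 50
X = rng.standard_normal((n, p))          # true covariance is the identity (sparse)
S = np.cov(X, rowvar=False)              # noisy p x p sample covariance
lam = np.sqrt(np.log(p) / n)             # threshold at the sqrt(log p / n) rate
S_thr = soft_threshold_cov(S, lam)       # sparser estimate; zeros out small noise
```

Hard thresholding, SCAD, and the adaptive lasso fit the same template: only the scalar rule applied to each off-diagonal entry changes, which is why the estimator carries almost no computational burden beyond forming S.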
Funding Information:
Adam J. Rothman is a Ph.D. candidate, Department of Statistics, University of Michigan, Ann Arbor, MI 48109-1107 (E-mail: email@example.com). Elizaveta Levina is Assistant Professor, Department of Statistics, University of Michigan, Ann Arbor, MI 48109-1107 (E-mail: firstname.lastname@example.org). Ji Zhu is Associate Professor, Department of Statistics, University of Michigan, Ann Arbor, MI 48109-1107 (E-mail: email@example.com). Elizaveta Levina's research is supported in part by grants from the National Science Foundation (NSF; DMS-0505424 and DMS-0805798). Ji Zhu's research is supported in part by grants from the NSF (DMS-0505432 and DMS-0705532). The authors thank an Associate Editor and two referees for helpful suggestions.
Copyright 2009 Elsevier B.V. All rights reserved.
- High-dimensional data