Open Access
On the adaptive elastic-net with a diverging number of parameters
Hui Zou, Hao Helen Zhang
Ann. Statist. 37(4): 1733-1751 (August 2009). DOI: 10.1214/08-AOS625

Abstract

We consider the problem of model selection and estimation in situations where the number of parameters diverges with the sample size. When the dimension is high, an ideal method should have the oracle property [J. Amer. Statist. Assoc. 96 (2001) 1348–1360; Ann. Statist. 32 (2004) 928–961], which ensures optimal large-sample performance. Furthermore, high dimensionality often induces collinearity, which an ideal method should also handle properly. Many existing variable selection methods fail to achieve both goals simultaneously. In this paper, we propose the adaptive elastic-net, which combines the strengths of quadratic regularization and adaptively weighted lasso shrinkage. Under weak regularity conditions, we establish the oracle property of the adaptive elastic-net. We show by simulations that the adaptive elastic-net deals with collinearity better than other oracle-like methods, thus enjoying much improved finite-sample performance.
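For readers unfamiliar with the estimator, the following is a brief sketch of the two-stage construction described above; the notation ($\lambda_1$, $\lambda_2$, $\lambda_1^*$, $\gamma$ and the weights $\hat{w}_j$) is ours for illustration, and the precise definitions and conditions are given in the paper. First compute the elastic-net estimator
\[
\hat{\beta}(\mathrm{enet}) = \Bigl(1 + \tfrac{\lambda_2}{n}\Bigr)\,\arg\min_{\beta}\Bigl\{ \|y - X\beta\|^2 + \lambda_2 \|\beta\|_2^2 + \lambda_1 \|\beta\|_1 \Bigr\},
\]
then form data-driven weights $\hat{w}_j = \bigl|\hat{\beta}_j(\mathrm{enet})\bigr|^{-\gamma}$ for some $\gamma > 0$, and solve the adaptively weighted problem
\[
\hat{\beta}(\mathrm{AdaEnet}) = \Bigl(1 + \tfrac{\lambda_2}{n}\Bigr)\,\arg\min_{\beta}\Bigl\{ \|y - X\beta\|^2 + \lambda_2 \|\beta\|_2^2 + \lambda_1^* \sum_j \hat{w}_j |\beta_j| \Bigr\}.
\]
Roughly, the $\ell_2$ penalty stabilizes the estimate under collinearity, while the adaptively weighted $\ell_1$ penalty provides the selective shrinkage behind the oracle property.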

Citation


Hui Zou, Hao Helen Zhang. "On the adaptive elastic-net with a diverging number of parameters." Ann. Statist. 37(4): 1733–1751, August 2009. https://doi.org/10.1214/08-AOS625

Information

Published: August 2009
First available in Project Euclid: 18 June 2009

zbMATH: 1168.62064
MathSciNet: MR2533470
Digital Object Identifier: 10.1214/08-AOS625

Subjects:
Primary: 62J05
Secondary: 62J07

Keywords: adaptive regularization, elastic-net, high dimensionality, model selection, oracle property, shrinkage methods

Rights: Copyright © 2009 Institute of Mathematical Statistics
