Published online by Cambridge University Press: 05 August 2021
This paper derives asymptotic risk (expected loss) results for shrinkage estimators with multidimensional regularization in high-dimensional settings. We introduce a class of multidimensional shrinkage estimators (MuSEs), which includes the elastic net, and show that—as the number of parameters to estimate grows—the empirical loss converges to the oracle-optimal risk. This result holds when the regularization parameters are estimated empirically via cross-validation or Stein’s unbiased risk estimate. To help guide applied researchers in their choice of estimator, we compare the empirical Bayes risk of the lasso, ridge, and elastic net in a spike-and-normal setting. Of the three estimators, we find that the elastic net performs best when the data are moderately sparse and the lasso performs best when the data are highly sparse. Our analysis suggests that applied researchers who are unsure about the level of sparsity in their data may benefit from using MuSEs such as the elastic net. We exploit these insights to propose a new estimator, the cubic net, and demonstrate through simulations that it outperforms the other three estimators at every sparsity level.
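The comparison described in the abstract can be illustrated in the classical normal-means setting, where the lasso, ridge, and (naive) elastic net each have a well-known closed-form componentwise shrinkage rule. The sketch below is illustrative only and is not taken from the paper: the penalty values, the sparsity level, and the spike-and-normal parameters are all assumptions chosen for the example.

```python
import numpy as np

# Closed-form shrinkage rules for the normal-means problem y_i = theta_i + noise.
# lam1 (L1 penalty) and lam2 (L2 penalty) are hypothetical tuning parameters.

def ridge(y, lam2):
    # Ridge shrinks every coordinate proportionally toward zero.
    return y / (1.0 + lam2)

def lasso(y, lam1):
    # Lasso soft-thresholds: small observations are set exactly to zero.
    return np.sign(y) * np.maximum(np.abs(y) - lam1, 0.0)

def elastic_net(y, lam1, lam2):
    # The naive elastic net combines both: threshold, then shrink.
    return np.sign(y) * np.maximum(np.abs(y) - lam1, 0.0) / (1.0 + lam2)

# Spike-and-normal simulation: most true means are exactly zero (the spike),
# the rest are drawn from a normal distribution (parameters assumed here).
rng = np.random.default_rng(0)
p, sparsity = 10_000, 0.9
theta = np.where(rng.random(p) < sparsity, 0.0, rng.normal(0.0, 2.0, p))
y = theta + rng.normal(size=p)  # noisy observations

for name, est in [("ridge", ridge(y, 1.0)),
                  ("lasso", lasso(y, 1.0)),
                  ("elastic net", elastic_net(y, 0.5, 0.5))]:
    print(name, np.mean((est - theta) ** 2))  # empirical squared-error loss
```

Re-running the loop with a lower `sparsity` value gives a rough sense of the abstract's claim that the relative ranking of the three estimators depends on how sparse the true parameter vector is.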
I thank Rachael Meager for invaluable guidance provided throughout the supervision of this paper at the London School of Economics, and Alberto Abadie, Maximilian Kasy, and Matthew Levy for insightful comments and feedback. I also thank David Card for starting me on this line of research. A co-editor of this journal and three anonymous referees provided very useful comments that helped improve this paper.