
CONSISTENT AND CONSERVATIVE MODEL SELECTION WITH THE ADAPTIVE LASSO IN STATIONARY AND NONSTATIONARY AUTOREGRESSIONS

Published online by Cambridge University Press:  01 September 2015

Anders Bredahl Kock*
Affiliation: Aarhus University and CREATES

*Address correspondence to Anders Bredahl Kock, Aarhus University and CREATES, Fuglesangs Alle 4, 8210 Aarhus V, Denmark. Email: [email protected].

Abstract

We show that the adaptive Lasso is oracle efficient in stationary and nonstationary autoregressions. This means that it estimates parameters consistently, selects the correct sparsity pattern, and estimates the coefficients belonging to the relevant variables at the same asymptotic efficiency as if only these had been included in the model from the outset. In particular, this implies that it is able to discriminate between stationary and nonstationary autoregressions, and it thereby constitutes an addition to the set of unit root tests. Next, and importantly in practice, we show that choosing the tuning parameter by the Bayesian Information Criterion (BIC) results in consistent model selection.
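For concreteness, here is a schematic version of the estimator described above, in illustrative notation that is not taken verbatim from the paper. For an autoregression y_t = \sum_{j=1}^{p} \beta_j y_{t-j} + \varepsilon_t, the adaptive Lasso solves

\hat{\beta} = \arg\min_{\beta} \sum_{t=p+1}^{T} \Big( y_t - \sum_{j=1}^{p} \beta_j y_{t-j} \Big)^2 + \lambda_T \sum_{j=1}^{p} \frac{|\beta_j|}{|\hat{\beta}_{j,\mathrm{I}}|^{\gamma}},

where the \hat{\beta}_{j,\mathrm{I}} are first-stage (e.g., least-squares) estimates and \gamma > 0. Choosing the tuning parameter by BIC amounts to minimizing

\mathrm{BIC}(\lambda) = T \log \hat{\sigma}^2(\lambda) + |\hat{S}(\lambda)| \log T

over a grid of \lambda values, where \hat{S}(\lambda) is the set of coefficients estimated as nonzero at \lambda and \hat{\sigma}^2(\lambda) is the corresponding residual variance.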

However, it is also shown that the adaptive Lasso has no power against shrinking alternatives of the form c/T if it is tuned to perform consistent model selection. We show that if the adaptive Lasso is instead tuned to perform conservative model selection, it retains power even against shrinking alternatives of this form, and we compare its behavior to that of the plain Lasso.
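A minimal sketch of the BIC-tuned procedure in Python, assuming NumPy and scikit-learn; the adaptive penalty is implemented by rescaling the lagged regressors, and the function name, grid, and weight exponent are choices of this sketch rather than of the paper. The BIC step corresponds to the consistent-selection tuning discussed above.

import numpy as np
from sklearn.linear_model import Lasso

def adaptive_lasso_ar(y, p, gamma=1.0, lambdas=np.logspace(-4, 1, 50)):
    # Adaptive Lasso for an AR(p) model with a BIC-selected tuning parameter.
    # Illustrative sketch only: the adaptive weights come from an OLS first stage.
    y = np.asarray(y, dtype=float)
    T = len(y) - p
    # Lagged design matrix: column j holds y_{t-j-1}.
    X = np.column_stack([y[p - j - 1:len(y) - j - 1] for j in range(p)])
    yt = y[p:]
    # First-stage OLS estimates.
    beta_ols, *_ = np.linalg.lstsq(X, yt, rcond=None)
    w = np.abs(beta_ols) ** gamma   # adaptive weights
    Xw = X * w                      # rescaling the columns implements the weighted L1 penalty
    best = None
    for lam in lambdas:
        # Note: sklearn's Lasso scales the squared-error term by 1/(2T),
        # so lam is on a different scale than lambda_T in the formula above.
        fit = Lasso(alpha=lam, fit_intercept=False, max_iter=10000).fit(Xw, yt)
        beta = fit.coef_ * w        # undo the rescaling
        resid = yt - X @ beta
        k = np.count_nonzero(beta)
        bic = T * np.log(resid @ resid / T) + k * np.log(T)
        if best is None or bic < best[0]:
            best = (bic, lam, beta)
    return best[2], best[1]         # selected coefficients and the chosen lambda

On a simulated AR(1) with redundant extra lags, a call such as adaptive_lasso_ar(y, p=5) would typically set the coefficients on the redundant lags exactly to zero while retaining the first lag.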

Type: Articles
Copyright: © Cambridge University Press 2015


Supplementary material: Anders Bredahl Kock supplementary material S1 (PDF, 295.5 KB)