Consistent and conservative model selection with the adaptive Lasso in stationary and nonstationary autoregressions


Bibliographic Details
Main Author: Kock, A
Format: Journal article
Published: Cambridge University Press 2015
Description
Summary: We show that the adaptive Lasso is oracle efficient in stationary and nonstationary autoregressions. This means that it estimates parameters consistently, selects the correct sparsity pattern, and estimates the coefficients belonging to the relevant variables at the same asymptotic efficiency as if only these had been included in the model from the outset. In particular, this implies that it is able to discriminate between stationary and nonstationary autoregressions, and it thereby constitutes an addition to the set of unit root tests. Next, and importantly in practice, we show that choosing the tuning parameter by the Bayesian Information Criterion (BIC) results in consistent model selection. However, it is also shown that the adaptive Lasso has no power against shrinking alternatives of the form c/T if it is tuned to perform consistent model selection. We show that if the adaptive Lasso is instead tuned to perform conservative model selection, it has power even against shrinking alternatives of this form, and we compare it to the plain Lasso.
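The summary describes applying the adaptive Lasso to an autoregression and selecting its tuning parameter by BIC. The following is a minimal sketch of how such a procedure could look in practice, assuming an AR(p) model in levels and using numpy and scikit-learn; the function name adaptive_lasso_ar, the weight exponent gamma, and the lambda grid are illustrative choices and not the paper's exact procedure.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

def adaptive_lasso_ar(y, p=3, gamma=1.0, lambdas=np.logspace(-3, 1, 50)):
    """Adaptive Lasso for an AR(p) model, with lambda chosen by BIC (illustrative sketch)."""
    T = len(y) - p
    # Lagged design matrix: column j holds y_{t-j-1} (lags 1..p)
    X = np.column_stack([y[p - j - 1: len(y) - j - 1] for j in range(p)])
    yt = y[p:]

    # First-stage OLS estimates supply the adaptive weights
    beta_init = LinearRegression(fit_intercept=False).fit(X, yt).coef_
    w = np.abs(beta_init) ** gamma      # penalty on coefficient j is proportional to 1/w[j]
    Xw = X * w                          # rescale columns so a plain Lasso solves the weighted problem

    best_bic, best_beta = np.inf, None
    for lam in lambdas:
        fit = Lasso(alpha=lam, fit_intercept=False, max_iter=10000).fit(Xw, yt)
        beta = fit.coef_ * w            # undo the rescaling to recover AR coefficients
        resid = yt - X @ beta
        k = np.count_nonzero(beta)      # number of selected lags
        bic = T * np.log(resid @ resid / T) + k * np.log(T)
        if bic < best_bic:
            best_bic, best_beta = bic, beta
    return best_beta
```

The column-rescaling step is a standard way to reduce the adaptive Lasso to an ordinary Lasso problem: scaling column j by the weight and then multiplying the fitted coefficient back by the same weight yields a penalty inversely proportional to the first-stage estimate, so lags with small initial estimates are penalized more heavily.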