Least Squares after Model Selection in High-Dimensional Sparse Models

Note: New title; former title: Post-ℓ1-Penalized Estimators in High-Dimensional Linear Regression Models. First version submitted March 29, 2010; original date January 4, 2009; this revision June 14, 2011.

Bibliographic Details
Main Authors: Belloni, Alexandre; Chernozhukov, Victor
Format: Working Paper
Language: English
Published: Cambridge, MA: Dept. of Economics, M.I.T., 2011
Subjects: Lasso; OLS Post Lasso; Post-Model-Selection Estimators
Online Access: http://hdl.handle.net/1721.1/65111
Series: Working Paper (Massachusetts Institute of Technology, Department of Economics); 10-05

Abstract
In this paper we study post-model-selection estimators that apply ordinary least squares (OLS) to the model selected by first-step penalized estimators, typically Lasso. It is well known that Lasso can estimate the nonparametric regression function at nearly the oracle rate, and is thus hard to improve upon. We show that the OLS post-Lasso estimator performs at least as well as Lasso in terms of the rate of convergence, and has the advantage of a smaller bias. Remarkably, this performance occurs even if the Lasso-based model selection “fails” in the sense of missing some components of the “true” regression model. By the “true” model we mean here the best s-dimensional approximation to the nonparametric regression function chosen by the oracle. Furthermore, the OLS post-Lasso estimator can perform strictly better than Lasso, in the sense of a strictly faster rate of convergence, if the Lasso-based model selection correctly includes all components of the “true” model as a subset and also achieves sufficient sparsity. In the extreme case, when Lasso perfectly selects the “true” model, the OLS post-Lasso estimator becomes the oracle estimator. An important ingredient in our analysis is a new sparsity bound on the dimension of the model selected by Lasso, which guarantees that this dimension is at most of the same order as the dimension of the “true” model. Our rate results are non-asymptotic and hold in both parametric and nonparametric models. Moreover, our analysis is not limited to Lasso acting as the selector in the first step: it applies to any other first-step estimator with good rates and good sparsity properties, for example various forms of thresholded Lasso. Our analysis covers both traditional thresholding and a new practical, data-driven thresholding scheme that induces maximal sparsity subject to maintaining a certain goodness of fit. The latter scheme has theoretical guarantees similar to those of Lasso or OLS post-Lasso, but it dominates these procedures, as well as traditional thresholding, in a wide variety of experiments.
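Illustration. The abstract describes a two-step estimator: run Lasso as a model selector, then refit the selected covariates by ordinary least squares. Below is a minimal Python sketch of that idea, assuming a scikit-learn Lasso with a fixed, user-chosen penalty level; the function name ols_post_lasso, the penalty value, and the simulated data are illustrative assumptions, not the authors' implementation or their recommended penalty choice.

    import numpy as np
    from sklearn.linear_model import Lasso, LinearRegression

    def ols_post_lasso(X, y, alpha=0.1):
        """Two-step estimator: Lasso selects the model, OLS refits it without shrinkage."""
        lasso = Lasso(alpha=alpha).fit(X, y)
        support = np.flatnonzero(lasso.coef_)   # covariates selected by Lasso
        beta = np.zeros(X.shape[1])
        if support.size == 0:                   # Lasso selected nothing; return the zero fit
            return beta
        ols = LinearRegression(fit_intercept=False).fit(X[:, support], y)
        beta[support] = ols.coef_               # refitting removes the Lasso shrinkage on kept terms
        return beta

    # Simulated sparse design: only the first s of p coefficients are nonzero.
    rng = np.random.default_rng(0)
    n, p, s = 100, 200, 5
    X = rng.standard_normal((n, p))
    y = X[:, :s].sum(axis=1) + rng.standard_normal(n)
    print("selected covariates:", np.flatnonzero(ols_post_lasso(X, y)))

The unpenalized refit in the second step removes the shrinkage applied to the selected coefficients, which is the "smaller bias" advantage the abstract attributes to OLS post-Lasso.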