Pivotal estimation via square-root Lasso in nonparametric regression
We propose a self-tuning √Lasso method that simultaneously resolves three important practical problems in high-dimensional regression analysis: it handles unknown scale, heteroscedasticity, and (drastic) non-Gaussianity of the noise. In addition, our analysis allows for badly behaved designs, for example, perfectly collinear regressors, and generates sharp bounds even in extreme cases, such as the infinite-variance case and the noiseless case, in contrast to Lasso. We establish various nonasymptotic bounds for √Lasso, including prediction-norm rate and sparsity. Our analysis is based on new impact factors that are tailored to bounding the prediction norm. To cover heteroscedastic non-Gaussian noise, we rely on moderate deviation theory for self-normalized sums to achieve Gaussian-like results under weak conditions. Moreover, we derive bounds on the performance of ordinary least squares (ols) applied to the model selected by √Lasso, accounting for possible misspecification of the selected model. Under mild conditions, the rate of convergence of ols post √Lasso is as good as √Lasso's rate. As an application, we consider the use of √Lasso and ols post √Lasso as estimators of nuisance parameters in a generic semiparametric problem (nonlinear moment condition or Z-problem), resulting in a construction of √n-consistent and asymptotically normal estimators of the main parameters.
Main Authors: | Belloni, Alexandre; Wang, Lie; Chernozhukov, Victor V. |
---|---|
Other Authors: | Massachusetts Institute of Technology. Department of Economics |
Format: | Article |
Language: | en_US |
Published: | Institute of Mathematical Statistics, 2015 |
Online Access: | http://hdl.handle.net/1721.1/93187 https://orcid.org/0000-0003-3582-8898 https://orcid.org/0000-0002-3250-6714 |
author | Belloni, Alexandre; Wang, Lie; Chernozhukov, Victor V. |
author2 | Massachusetts Institute of Technology. Department of Economics |
collection | MIT |
description | We propose a self-tuning √Lasso method that simultaneously resolves three important practical problems in high-dimensional regression analysis: it handles unknown scale, heteroscedasticity, and (drastic) non-Gaussianity of the noise. In addition, our analysis allows for badly behaved designs, for example, perfectly collinear regressors, and generates sharp bounds even in extreme cases, such as the infinite-variance case and the noiseless case, in contrast to Lasso. We establish various nonasymptotic bounds for √Lasso, including prediction-norm rate and sparsity. Our analysis is based on new impact factors that are tailored to bounding the prediction norm. To cover heteroscedastic non-Gaussian noise, we rely on moderate deviation theory for self-normalized sums to achieve Gaussian-like results under weak conditions. Moreover, we derive bounds on the performance of ordinary least squares (ols) applied to the model selected by √Lasso, accounting for possible misspecification of the selected model. Under mild conditions, the rate of convergence of ols post √Lasso is as good as √Lasso's rate. As an application, we consider the use of √Lasso and ols post √Lasso as estimators of nuisance parameters in a generic semiparametric problem (nonlinear moment condition or Z-problem), resulting in a construction of √n-consistent and asymptotically normal estimators of the main parameters. |
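For orientation, here is a compact sketch of the estimator the description refers to, together with the pivotal penalty level this line of work is known for (n observations, p regressors, miscoverage level α, slack constant c > 1 are assumed notation; consult the cited Annals of Statistics article for the authoritative definitions):

```latex
\[
\widehat{\beta} \;\in\; \arg\min_{\beta \in \mathbb{R}^{p}}
\sqrt{\frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - x_i^{\top}\beta\bigr)^{2}}
\;+\; \frac{\lambda}{n}\,\lVert \beta \rVert_{1},
\qquad
\lambda \;=\; c\,\sqrt{n}\;\Phi^{-1}\!\bigl(1 - \alpha/(2p)\bigr).
\]
```

Because the fidelity term is the square root of the average squared residual, λ need not scale with the unknown noise level σ; this is exactly the self-tuning, pivotal property the description emphasizes.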
format | Article |
id | mit-1721.1/93187 |
institution | Massachusetts Institute of Technology |
language | en_US |
publishDate | 2015 |
publisher | Institute of Mathematical Statistics |
record_format | dspace |
departments | Massachusetts Institute of Technology. Department of Economics; Massachusetts Institute of Technology. Department of Mathematics |
sponsorship | National Science Foundation (U.S.) |
dates | 2015-01-29T15:56:22Z; 2014-04; 2013-12 |
type | Article (http://purl.org/eprint/type/JournalArticle) |
issn | 0090-5364 |
citation | Belloni, Alexandre, Victor Chernozhukov, and Lie Wang. “Pivotal Estimation via Square-Root Lasso in Nonparametric Regression.” Ann. Statist. 42, no. 2 (April 2014): 757–788. |
doi | http://dx.doi.org/10.1214/14-AOS1204 |
journal | Annals of Statistics |
rights | Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/) |
source | arXiv |
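To make the method concrete, below is a minimal computational sketch, not the authors' code: √Lasso solved through its scaled-Lasso reformulation (alternating a plain Lasso step with a noise-scale update), followed by the ols post √Lasso refit mentioned in the description. All names (`sqrt_lasso`, `lam`, `sigma`, and the toy data) are illustrative assumptions.

```python
# Minimal sketch of sqrt-Lasso via its scaled-Lasso reformulation,
# plus an OLS refit on the selected support ("ols post sqrt-Lasso").
# Function and variable names are illustrative, not from the paper.
import numpy as np
from scipy.stats import norm
from sklearn.linear_model import Lasso

def sqrt_lasso(X, y, alpha=0.05, c=1.1, n_iter=50, tol=1e-8):
    n, p = X.shape
    # Pivotal penalty level: independent of the unknown noise scale.
    lam = c * np.sqrt(n) * norm.ppf(1 - alpha / (2 * p))
    sigma = np.std(y)          # crude initial scale estimate
    beta = np.zeros(p)
    for _ in range(n_iter):
        # Given sigma, the beta-step is an ordinary Lasso:
        # min_b (1/(2n))||y - Xb||^2 + (sigma * lam / n) ||b||_1.
        beta = Lasso(alpha=sigma * lam / n, fit_intercept=False).fit(X, y).coef_
        # Given beta, the optimal sigma is the RMS residual.
        new_sigma = np.sqrt(np.mean((y - X @ beta) ** 2))
        if abs(new_sigma - sigma) < tol:
            sigma = new_sigma
            break
        sigma = new_sigma
    # OLS post sqrt-Lasso: unpenalized refit on the selected support.
    support = np.flatnonzero(beta)
    beta_post = np.zeros(p)
    if support.size:
        beta_post[support] = np.linalg.lstsq(X[:, support], y, rcond=None)[0]
    return beta, beta_post, sigma

# Toy usage on synthetic sparse data:
rng = np.random.default_rng(0)
n, p, s = 200, 500, 5
X = rng.standard_normal((n, p))
beta0 = np.r_[np.ones(s), np.zeros(p - s)]
y = X @ beta0 + rng.standard_normal(n)
beta, beta_post, sigma = sqrt_lasso(X, y)
print(f"selected {np.count_nonzero(beta)} coefficients, sigma_hat = {sigma:.3f}")
```

The beta-step matches scikit-learn's Lasso parameterization, (1/(2n))‖y − Xb‖² + α‖b‖₁, with α = σ̂λ/n, so an off-the-shelf solver suffices; no tuning of λ to the noise level is required.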
title | Pivotal estimation via square-root Lasso in nonparametric regression |
url | http://hdl.handle.net/1721.1/93187 https://orcid.org/0000-0003-3582-8898 https://orcid.org/0000-0002-3250-6714 |