Lasso-type recovery of sparse representations for high-dimensional data


Bibliographic Details

Main Authors: Meinshausen, N.; Yu, B.
Format: Journal article
Language: English
Published: 2008
Institution: University of Oxford
Record ID: oxford-uuid:c5f66cc5-17c1-4fd2-a66a-3b33e8229ef2

Abstract:
The Lasso is an attractive technique for regularization and variable selection for high-dimensional data, where the number of predictor variables $p_n$ is potentially much larger than the number of samples $n$. However, it was recently discovered that the sparsity pattern of the Lasso estimator can be asymptotically identical to the true sparsity pattern only if the design matrix satisfies the so-called irrepresentable condition, which is easily violated in the presence of highly correlated variables. Here we examine the behavior of the Lasso estimator when the irrepresentable condition is relaxed. Even though the Lasso cannot then recover the correct sparsity pattern, we show that the estimator is still consistent in the $\ell_2$-norm sense for fixed designs, under conditions on (a) the number $s_n$ of nonzero components of the vector $\beta_n$ and (b) the minimal singular values of the design matrices induced by selecting small subsets of variables. Furthermore, a rate of convergence for the $\ell_2$ error is obtained with an appropriate choice of the smoothing parameter; the rate is shown to be optimal under bounded maximal and minimal sparse eigenvalues. Our results imply that, with high probability, all important variables are selected, so the set of selected variables is a meaningful reduction of the original set of variables. Finally, our results are illustrated with the detection of closely adjacent frequencies, a problem encountered in astrophysics.
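As a rough illustration of the setting the abstract describes ($p_n \gg n$, a sparse coefficient vector $\beta_n$ with $s_n$ nonzero entries), the following is a minimal NumPy sketch of the Lasso fitted by cyclic coordinate descent. It is not the authors' code; the problem sizes, regularization level, and the coordinate-descent solver are illustrative assumptions, chosen so that the support-screening behavior discussed in the paper (all important variables selected, with modest $\ell_2$ error) is visible on simulated data.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent on (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n   # per-column scaling ||X_j||^2 / n
    r = y - X @ b                        # running residual
    for _ in range(n_iter):
        for j in range(p):
            # correlation of column j with the partial residual (j-th effect added back)
            rho = X[:, j] @ r / n + col_sq[j] * b[j]
            # soft-thresholding update for coordinate j
            bj_new = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r += X[:, j] * (b[j] - bj_new)
            b[j] = bj_new
    return b

# Simulated high-dimensional design: p > n, s true nonzero coefficients (illustrative sizes).
rng = np.random.default_rng(0)
n, p, s = 100, 200, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 3.0
y = X @ beta + 0.5 * rng.standard_normal(n)

bhat = lasso_cd(X, y, lam=0.2)
err = np.linalg.norm(bhat - beta)        # l2 estimation error
selected = np.flatnonzero(np.abs(bhat) > 1e-8)
```

In this toy run all five truly important variables receive clearly nonzero estimates, mirroring the screening property described in the abstract, even though the Lasso estimates are shrunk toward zero by the penalty.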