Relaxed Lasso.

The Lasso is an attractive regularisation method for high-dimensional regression. It combines variable selection with an efficient computational procedure. However, the rate of convergence of the Lasso is slow for some sparse high-dimensional data, where the number of predictor variables is growing fast with the number of observations. Moreover, many noise variables are selected if the estimator is chosen by cross-validation. It is shown that the contradicting demands of an efficient computational procedure and fast convergence rates of the ℓ2-loss can be overcome by a two-stage procedure, termed the relaxed Lasso. For orthogonal designs, the relaxed Lasso provides a continuum of solutions that include both soft- and hard-thresholding of estimators. The relaxed Lasso solutions include all regular Lasso solutions, and computing all relaxed Lasso solutions is often no more expensive than computing all regular Lasso solutions. Theoretical and numerical results demonstrate that the relaxed Lasso produces sparser models with equal or lower prediction loss than the regular Lasso estimator for high-dimensional data. © 2007 Elsevier B.V. All rights reserved.
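
The abstract describes the relaxed Lasso as a two-stage procedure: the Lasso first selects a set of variables at penalty level lambda, and the coefficients on that selected set are then re-estimated with a weaker penalty (a fraction phi of the original one). The following Python sketch illustrates that idea using scikit-learn's Lasso; the function name relaxed_lasso, the parameter phi, and the refitting scheme are illustrative assumptions, not the paper's reference implementation.

    # Minimal sketch of the two-stage relaxed-Lasso idea (assumed interface).
    import numpy as np
    from sklearn.linear_model import Lasso

    def relaxed_lasso(X, y, lam, phi):
        """Two-stage sketch: select with penalty `lam`, refit with `phi * lam`.

        phi = 1 recovers the ordinary Lasso fit on the selected variables;
        phi -> 0 approaches an unpenalised least-squares refit on that set.
        """
        # Stage 1: Lasso at penalty `lam` to pick the active set.
        stage1 = Lasso(alpha=lam).fit(X, y)
        active = np.flatnonzero(stage1.coef_)

        # Stage 2: re-estimate on the selected variables with the relaxed penalty.
        beta = np.zeros(X.shape[1])
        if active.size > 0:
            stage2 = Lasso(alpha=phi * lam).fit(X[:, active], y)
            beta[active] = stage2.coef_
        return beta

In this sketch, varying phi between 0 and 1 traces the continuum between the heavily shrunken Lasso solution and a nearly unshrunken refit on the selected variables, which is the mechanism the abstract credits for sparser models with equal or lower prediction loss.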

Bibliographic Details
Main Author: Meinshausen, N
Format: Journal article
Language: English
Published: 2007