Implicit regularization for optimal sparse recovery

We investigate implicit regularization schemes for gradient descent methods applied to unpenalized least squares regression to solve the problem of reconstructing a sparse signal from an underdetermined system of linear measurements under the restricted isometry assumption. For a given parametrization yielding a non-convex optimization problem, we show that prescribed choices of initialization, step size and stopping time yield a statistically and computationally optimal algorithm that achieves the minimax rate with the same cost required to read the data up to poly-logarithmic factors. Beyond minimax optimality, we show that our algorithm adapts to instance difficulty and yields a dimension-independent rate when the signal-to-noise ratio is high enough. Key to the computational efficiency of our method is an increasing step size scheme that adapts to refined estimates of the true solution. We validate our findings with numerical experiments and compare our algorithm against explicit ℓ1 penalization. Going from hard instances to easy ones, our algorithm is seen to undergo a phase transition, eventually matching least squares with an oracle knowledge of the true support.
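The abstract describes gradient descent on a non-convex reparametrization of unpenalized least squares, with a small initialization and early stopping acting as implicit regularizers. As a hedged illustration only (not the authors' prescribed algorithm): a commonly studied parametrization in this line of work writes the weights as w = u⊙u − v⊙v, and the sketch below runs plain gradient descent on that parametrization. All constants (initialization α, step size η, stopping time T) are illustrative choices, and the paper's increasing step size scheme is omitted for simplicity.

```python
import numpy as np

# Toy sparse recovery problem: y = X @ w_true + noise, with n < d.
rng = np.random.default_rng(0)
n, d, k = 200, 400, 5                          # measurements, dimension, sparsity
X = rng.standard_normal((n, d)) / np.sqrt(n)   # roughly unit-norm columns (RIP-style scaling)
w_true = np.zeros(d)
w_true[:k] = 1.0
y = X @ w_true + 0.01 * rng.standard_normal(n)

# Hadamard-product parametrization w = u*u - v*v, small initialization alpha.
alpha, eta, T = 1e-8, 0.1, 300                 # illustrative constants, not the paper's
u = np.full(d, alpha)
v = np.full(d, alpha)
for _ in range(T):                             # early stopping after T iterations
    w = u * u - v * v
    g = X.T @ (X @ w - y)                      # gradient of 0.5*||Xw - y||^2 in w
    u = u - eta * 2 * u * g                    # chain rule through u*u
    v = v + eta * 2 * v * g                    # chain rule through -v*v
w = u * u - v * v

print("max error on support:", float(np.max(np.abs(w[:k] - 1.0))))
print("max entry off support:", float(np.max(np.abs(w[k:]))))
```

The small initialization keeps all coordinates near zero until their gradients have acted long enough; coordinates on the true support grow exponentially faster than the rest, so stopping early leaves the off-support coordinates effectively at zero — the implicit analogue of an explicit sparsity penalty.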


Bibliographic Details
Main Authors: Vaškevičius, T, Kanade, V, Rebeschini, P
Format: Conference item
Language: English
Published: Neural Information Processing Systems Foundation 2019
Record ID: oxford-uuid:5e0ecefe-626a-49a1-baea-f9b237165115
Institution: University of Oxford