A fast algorithm for separated sparsity via perturbed Lagrangians

Copyright 2018 by the author(s). Sparsity-based methods are widely used in machine learning, statistics, and signal processing. There is now a rich class of structured sparsity approaches that expand the modeling power of the sparsity paradigm and incorporate constraints such as group sparsity, graph sparsity, or hierarchical sparsity. While these sparsity models offer improved sample complexity and better interpretability, the improvements come at a computational cost: it is often challenging to optimize over the (non-convex) constraint sets that capture various sparsity structures. In this paper, we make progress in this direction in the context of separated sparsity – a fundamental sparsity notion that captures exclusion constraints in linearly ordered data such as time series. While prior algorithms for computing a projection onto this constraint set required quadratic time, we provide a perturbed Lagrangian relaxation approach that computes a provably exact projection in only nearly-linear time. Although the sparsity constraint is non-convex, our perturbed Lagrangian approach is still guaranteed to find a globally optimal solution. In experiments, our new algorithms offer a 10× speed-up already on moderately sized inputs.

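To make the projection problem concrete: given a signal x, a sparsity level k, and a separation Δ, the task is to find the closest vector to x whose support consists of k indices that are pairwise at least Δ apart. The sketch below is a minimal quadratic-time dynamic-programming baseline for this exact projection, written only to illustrate the constraint and the objective (the abstract notes that prior algorithms needed quadratic time); it is not the paper's nearly-linear perturbed-Lagrangian algorithm, and the function name and interface are illustrative assumptions rather than anything from the authors' code.

import numpy as np

def separated_sparsity_project(x, k, delta):
    """Project x onto vectors supported on k indices that are pairwise >= delta apart."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    gains = x ** 2                      # keeping index i preserves x[i]^2 of the signal's energy
    NEG = -np.inf

    # dp[i, j]: best total gain selecting j indices among the first i coordinates.
    dp = np.full((n + 1, k + 1), NEG)
    dp[:, 0] = 0.0
    take = np.zeros((n + 1, k + 1), dtype=bool)

    for i in range(1, n + 1):
        for j in range(1, k + 1):
            skip = dp[i - 1, j]
            # Selecting coordinate i-1 forces the previous selection to lie
            # among the first i-delta coordinates (i.e. at an index <= i-1-delta).
            prev = max(i - delta, 0)
            pick = dp[prev, j - 1] + gains[i - 1]
            if pick > skip:
                dp[i, j], take[i, j] = pick, True
            else:
                dp[i, j] = skip

    if dp[n, k] == NEG:
        raise ValueError("infeasible: need (k - 1) * delta < n")

    # Trace back the selected support and zero out everything else.
    out, i, j = np.zeros(n), n, k
    while j > 0:
        if take[i, j]:
            out[i - 1] = x[i - 1]
            i, j = max(i - delta, 0), j - 1
        else:
            i -= 1
    return out

For example, on x = [0.1, 3.0, 0.2, 2.5, 0.0, 1.0] with k = 2 and delta = 2, this keeps the entries at indices 1 and 3 (which are exactly two apart) and zeroes the rest. The contribution described in the abstract is to replace this kind of O(nk) computation with a nearly-linear-time method while still returning an exact, globally optimal projection.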

Bibliographic Details
Main Authors: Madry, A., Mitrović, S., Schmidt, L.
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science; Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Format: Article (Conference Paper)
Language: English
Published: 2021
Online Access: https://hdl.handle.net/1721.1/137027
Full Text: http://proceedings.mlr.press/v84/madry18a/madry18a.pdf
Citation: Madry, A., Mitrović, S. and Schmidt, L. 2018. "A fast algorithm for separated sparsity via perturbed Lagrangians." International Conference on Artificial Intelligence and Statistics, AISTATS 2018, 84. Proceedings of Machine Learning Research.
License: Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/)