Conditional gradient methods via stochastic path-integrated differential estimator

We propose a class of novel variance-reduced stochastic conditional gradient methods. By adopting the recent stochastic path-integrated differential estimator technique (SPIDER) of Fang et al. (2018) for the classical Frank-Wolfe (FW) method, we introduce SPIDER-FW for finite-sum minimization as well as the more general expectation minimization problems. SPIDER-FW enjoys superior complexity guarantees in the non-convex setting, while matching the best known FW variants in the convex case. We also extend our framework à la conditional gradient sliding (CGS) of Lan & Zhou (2016), and propose SPIDER-CGS.
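
The abstract describes combining the SPIDER variance-reduced gradient estimator with a projection-free Frank-Wolfe update. The following is a minimal, illustrative Python sketch of that idea on a least-squares finite sum over an l1-ball; the batch size, epoch length, and the classical 2/(t+2) step size are placeholder choices for illustration, not the schedules analyzed in the paper.

```python
# Minimal sketch of a SPIDER-style Frank-Wolfe loop (illustrative only).
# Objective: f(x) = (1/2n) * ||Ax - b||^2, constraint: l1-ball of given radius.
import numpy as np

def lmo_l1_ball(grad, radius):
    """Linear minimization oracle over the l1-ball: a signed, scaled coordinate vector."""
    j = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[j] = -radius * np.sign(grad[j])
    return s

def spider_fw_least_squares(A, b, radius=1.0, epochs=20, batch_size=8, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    epoch_len = max(n // batch_size, 1)   # refresh period q (placeholder choice)
    x = np.zeros(d)
    x_prev = x
    v = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for k in range(epoch_len):
            if k == 0:
                # Full-gradient refresh at the start of each epoch.
                v = A.T @ (A @ x - b) / n
            else:
                # SPIDER correction: v <- v + avg_i [grad_i(x_t) - grad_i(x_{t-1})]
                idx = rng.choice(n, size=batch_size, replace=False)
                Ai = A[idx]
                resid_new = Ai @ x - b[idx]
                resid_old = Ai @ x_prev - b[idx]
                v = v + Ai.T @ (resid_new - resid_old) / batch_size
            x_prev = x
            gamma = 2.0 / (t + 2.0)        # classical FW step size (placeholder)
            s = lmo_l1_ball(v, radius)     # FW: call the LMO instead of projecting
            x = x + gamma * (s - x)
            t += 1
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    x_true = np.zeros(50); x_true[:5] = 0.2
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat = spider_fw_least_squares(A, b, radius=1.0)
    print("objective:", 0.5 * np.mean((A @ x_hat - b) ** 2))
```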


Bibliographic Details
Main Author: Sra, Suvrit
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Format: Article
Language:English
Published: International Machine Learning Society 2021
Online Access:https://hdl.handle.net/1721.1/130530
_version_ 1826212930884993024
author Sra, Suvrit
author2 Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
author_facet Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Sra, Suvrit
author_sort Sra, Suvrit
collection MIT
description We propose a class of novel variance-reduced stochastic conditional gradient methods. By adopting the recent stochastic path-integrated differential estimator technique (SPIDER) of Fang et al. (2018) for the classical Frank-Wolfe (FW) method, we introduce SPIDER-FW for finite-sum minimization as well as the more general expectation minimization problems. SPIDER-FW enjoys superior complexity guarantees in the non-convex setting, while matching the best known FW variants in the convex case. We also extend our framework à la conditional gradient sliding (CGS) of Lan & Zhou (2016), and propose SPIDER-CGS.
first_indexed 2024-09-23T15:40:25Z
format Article
id mit-1721.1/130530
institution Massachusetts Institute of Technology
language English
last_indexed 2024-09-23T15:40:25Z
publishDate 2021
publisher International Machine Learning Society
record_format dspace
spelling mit-1721.1/1305302022-10-02T03:21:35Z Conditional gradient methods via stochastic path-integrated differential estimator Sra, Suvrit Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science We propose a class of novel variance-reduced stochastic conditional gradient methods. By adopting the recent stochastic path-integrated differential estimator technique (SPIDER) of Fang et al. (2018) for the classical Frank-Wolfe (FW) method, we introduce SPIDER-FW for finite-sum minimization as well as the more general expectation minimization problems. SPIDER-FW enjoys superior complexity guarantees in the non-convex setting, while matching the best known FW variants in the convex case. We also extend our framework à la conditional gradient sliding (CGS) of Lan & Zhou (2016), and propose SPIDER-CGS. National Science Foundation (U.S.) (Grant 200021178865/1) National Science Foundation (U.S.). Career (Grant 1846088) 2021-04-27T17:11:23Z 2021-04-27T17:11:23Z 2019-06 2021-04-06T18:34:21Z Article http://purl.org/eprint/type/ConferencePaper 2640-3498 https://hdl.handle.net/1721.1/130530 Yurtsever, Alp et al. “Conditional gradient methods via stochastic path-integrated differential estimator.” Paper in the Proceedings of Machine Learning Research, 97, 36th International Conference on Machine Learning ICML 2019, Long Beach, California, 9-15 June 2019, American Society of Mechanical Engineers: 7282-7291 © 2019 The Author(s) en Proceedings of Machine Learning Research Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. application/pdf International Machine Learning Society Proceedings of Machine Learning Research
spellingShingle Sra, Suvrit
Conditional gradient methods via stochastic path-integrated differential estimator
title Conditional gradient methods via stochastic path-integrated differential estimator
title_full Conditional gradient methods via stochastic path-integrated differential estimator
title_fullStr Conditional gradient methods via stochastic path-integrated differential estimator
title_full_unstemmed Conditional gradient methods via stochastic path-integrated differential estimator
title_short Conditional gradient methods via stochastic path-integrated differential estimator
title_sort conditional gradient methods via stochastic path integrated differential estimator
url https://hdl.handle.net/1721.1/130530
work_keys_str_mv AT srasuvrit conditionalgradientmethodsviastochasticpathintegrateddifferentialestimator