Conditional gradient methods via stochastic path-integrated differential estimator
We propose a class of novel variance-reduced stochastic conditional gradient methods. By adopting the recent stochastic path-integrated differential estimator technique (SPIDER) of Fang et al. (2018) for the classical Frank-Wolfe (FW) method, we introduce SPIDER-FW for finite-sum minimization as well...
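The abstract combines the SPIDER recursive gradient estimator with Frank-Wolfe steps. As a rough illustration only (not the paper's exact algorithm or constants), the sketch below applies this pattern to a toy finite-sum least-squares problem over an l1-ball; the restart period `q`, batch size, and step size are assumed values.

```python
import numpy as np

# Toy sketch of a SPIDER-style Frank-Wolfe loop (illustrative, not the
# paper's algorithm): minimize f(x) = (1/n) sum_i 0.5*(a_i^T x - b_i)^2
# over the l1-ball of radius r, tracking the gradient with the SPIDER
# recursive estimator and taking Frank-Wolfe (projection-free) steps.

rng = np.random.default_rng(0)
n, d, r = 200, 20, 5.0
A = rng.standard_normal((n, d))
b = 0.1 * (A @ rng.standard_normal(d))

def grad(x, idx):
    # mini-batch gradient over the rows indexed by `idx`
    Ai = A[idx]
    return Ai.T @ (Ai @ x - b[idx]) / len(idx)

def lmo(g):
    # linear minimization oracle for the l1-ball: the minimizing vertex
    # is -r * sign(g_i) * e_i at the coordinate of largest |g_i|
    i = np.argmax(np.abs(g))
    s = np.zeros_like(g)
    s[i] = -r * np.sign(g[i])
    return s

x = np.zeros(d)
q, batch = 20, 10                      # restart period, batch size (assumed)
for k in range(200):
    if k % q == 0:
        v = grad(x, np.arange(n))      # full gradient at each restart
    else:
        idx = rng.choice(n, batch, replace=False)
        v = grad(x, idx) - grad(x_prev, idx) + v   # SPIDER update
    x_prev = x
    gamma = 2.0 / (k + 2)              # standard FW step-size schedule
    x = x + gamma * (lmo(v) - x)       # projection-free FW step

full = A.T @ (A @ x - b) / n
gap = full @ (x - lmo(full))           # FW duality gap (>= 0, -> 0 at opt)
```

The key point is that between restarts each iteration touches only a mini-batch, yet the recursion `v = grad(x) - grad(x_prev) + v` keeps the estimator's variance controlled as the iterates move.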
| Main Author: | Sra, Suvrit |
|---|---|
| Other Authors: | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science |
| Format: | Article |
| Language: | English |
| Published: | International Machine Learning Society, 2021 |
| Online Access: | https://hdl.handle.net/1721.1/130530 |
Similar Items

- Escaping Saddle Points with Adaptive Gradient Methods
  by: Staib, Matthew, et al.
  Published: (2021)
- Riemannian Optimization via Frank-Wolfe Methods
  by: Weber, Melanie, et al.
  Published: (2022)
- Projection-free nonconvex stochastic optimization on Riemannian manifolds
  by: Weber, Melanie, et al.
  Published: (2022)
- An alternative to EM for Gaussian mixture models: batch and stochastic Riemannian optimization
  by: Hosseini, Reshad, et al.
  Published: (2021)
- A gradient method for optimizing stochastic systems
  by: Fitzgerald, Robert John
  Published: (2013)