A universally optimal multistage accelerated stochastic gradient method

© 2019 Neural Information Processing Systems Foundation. All rights reserved. We study the problem of minimizing a strongly convex, smooth function when we have noisy estimates of its gradient. We propose a novel multistage accelerated algorithm that is universally optimal in the sense that it achie...
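The abstract describes minimizing a strongly convex, smooth function from noisy gradient estimates with a multistage accelerated scheme. As a rough, generic illustration of that setting only (not the authors' algorithm or parameter choices), the sketch below runs Nesterov-style accelerated iterations in stages, shrinking the step size and lengthening the stages to damp gradient noise; the noise model, `n_stages`, `stage_len`, and the halving schedule are all hypothetical.

```python
import numpy as np

def noisy_grad(grad, x, sigma, rng):
    """Exact gradient plus additive Gaussian noise (illustrative noise model)."""
    return grad(x) + sigma * rng.standard_normal(x.shape)

def multistage_agd(grad, x0, L, mu, sigma, n_stages=4, stage_len=200, seed=0):
    """Generic multistage accelerated stochastic gradient sketch.

    Each stage runs Nesterov-style iterations with a fixed step size; later
    stages use a smaller step size and more iterations to average out noise.
    Parameter schedule is illustrative, not taken from the paper.
    """
    rng = np.random.default_rng(seed)
    x = y = x0.copy()
    alpha = 1.0 / L                                   # initial step size
    for stage in range(n_stages):
        kappa = 1.0 / (mu * alpha)                    # condition-like quantity for this stage
        beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)  # momentum parameter
        for _ in range(stage_len * 2 ** stage):
            g = noisy_grad(grad, y, sigma, rng)
            x_new = y - alpha * g                     # gradient step from extrapolated point
            y = x_new + beta * (x_new - x)            # momentum extrapolation
            x = x_new
        alpha /= 2.0                                  # smaller step size in later stages
    return x

# Example: strongly convex quadratic f(x) = 0.5 * x^T A x with noisy gradients
if __name__ == "__main__":
    A = np.diag([1.0, 10.0])
    grad = lambda x: A @ x
    x_final = multistage_agd(grad, x0=np.array([5.0, 5.0]), L=10.0, mu=1.0, sigma=0.1)
    print(x_final)
```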


Bibliographic Details
Main Authors: Aybat, N. S.; Fallah, A.; Gürbüzbalaban, M.; Ozdaglar, A.
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Format: Article
Language: English
Published: 2021
Online Access: https://hdl.handle.net/1721.1/137365