Robust Accelerated Gradient Methods for Smooth Strongly Convex Functions

© 2020 Society for Industrial and Applied Mathematics. We study the trade-offs between convergence rate and robustness to gradient errors in designing a first-order algorithm. We focus on gradient descent and accelerated gradient (AG) methods for minimizing strongly convex functions when the gradien...
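The abstract above contrasts gradient descent with accelerated gradient (AG) methods under gradient errors. As a minimal illustration of that setting, the sketch below runs both methods on a strongly convex quadratic with an inexact gradient oracle; the additive-noise error model, the specific quadratic, and all parameter choices here are assumptions for illustration, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Strongly convex quadratic f(x) = 0.5 * x^T A x,
# with strong-convexity modulus mu = 1 and smoothness constant L = 10.
A = np.diag([1.0, 10.0])
mu, L = 1.0, 10.0

def noisy_grad(x, sigma):
    # Inexact gradient oracle: true gradient plus additive noise.
    # (Additive Gaussian noise is an illustrative assumption; the
    # paper's error model may differ.)
    return A @ x + sigma * rng.standard_normal(x.shape)

def gradient_descent(x0, steps, sigma):
    # Plain gradient descent with the standard step size 1/L.
    x = x0.copy()
    for _ in range(steps):
        x = x - (1.0 / L) * noisy_grad(x, sigma)
    return x

def accelerated_gradient(x0, steps, sigma):
    # Nesterov's AG for strongly convex functions with constant
    # momentum beta = (sqrt(L) - sqrt(mu)) / (sqrt(L) + sqrt(mu)).
    beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
    x, y = x0.copy(), x0.copy()
    for _ in range(steps):
        x_next = y - (1.0 / L) * noisy_grad(y, sigma)
        y = x_next + beta * (x_next - x)
        x = x_next
    return x

x0 = np.array([5.0, 5.0])
# With exact gradients (sigma = 0), AG reaches a much smaller error
# than gradient descent in the same number of iterations; with noisy
# gradients (sigma > 0), AG's iterates are more sensitive to the noise.
print(np.linalg.norm(gradient_descent(x0, 50, 0.0)))
print(np.linalg.norm(accelerated_gradient(x0, 50, 0.0)))
```

Comparing the two printed distances to the minimizer (with and without noise) gives a rough feel for the rate-versus-robustness trade-off the paper studies rigorously.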


Bibliographic Details
Main Authors: Aybat, Necdet Serhat, Fallah, Alireza, Gürbüzbalaban, Mert, Ozdaglar, Asuman
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Format: Article
Language: English
Published: Society for Industrial & Applied Mathematics (SIAM) 2021
Online Access: https://hdl.handle.net/1721.1/133761