Factor-√2 Acceleration of Accelerated Gradient Methods


Bibliographic Details
Main Authors: Park, Chanwoo, Park, Jisun, Ryu, Ernest K.
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Format: Article
Language: English
Published: Springer US 2023
Online Access: https://hdl.handle.net/1721.1/152925
Description
Summary: The optimized gradient method (OGM) provides a factor-√2 speedup upon Nesterov’s celebrated accelerated gradient method in the convex (but non-strongly convex) setup. However, this improved acceleration mechanism has not been well understood; prior analyses of OGM relied on a computer-assisted proof methodology, so the proofs were opaque for humans despite being verifiable and correct. In this work, we present a new analysis of OGM based on a Lyapunov function and linear coupling. These analyses are developed and presented without the assistance of computers and are understandable by humans. Furthermore, we generalize OGM’s acceleration mechanism and obtain a factor-√2 speedup in other setups: acceleration with a simpler rational stepsize, the strongly convex setup, and the mirror descent setup.
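For readers unfamiliar with OGM, the following is a minimal sketch of its standard iteration as introduced by Kim and Fessler (the method the article analyzes), not the new analyses developed in this article. It assumes an L-smooth convex objective with known Lipschitz constant L; the quadratic test problem and all variable names are illustrative choices, not taken from the article.

```python
import numpy as np

def ogm(grad, x0, L, n_iter):
    """Optimized gradient method for an L-smooth convex function.

    Uses the standard OGM momentum schedule: theta grows roughly like k/2,
    with a modified update on the final iteration.
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    theta = 1.0
    for k in range(n_iter):
        # Gradient step with stepsize 1/L
        y_new = x - grad(x) / L
        # Momentum parameter: the last iteration uses a larger theta,
        # which is where the extra factor-sqrt(2) in the guarantee comes from
        if k < n_iter - 1:
            theta_new = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        else:
            theta_new = (1 + np.sqrt(1 + 8 * theta**2)) / 2
        # OGM adds a second momentum term (theta/theta_new)*(y_new - x)
        # on top of Nesterov's (theta-1)/theta_new * (y_new - y)
        x = (y_new
             + ((theta - 1) / theta_new) * (y_new - y)
             + (theta / theta_new) * (y_new - x))
        y, theta = y_new, theta_new
    return y

# Illustrative quadratic f(x) = 0.5 * x^T A x, minimized at the origin;
# L is the largest eigenvalue of A.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x_final = ogm(grad, np.array([5.0, 5.0]), L=10.0, n_iter=50)
f_final = 0.5 * x_final @ A @ x_final
```

Dropping the second momentum term recovers Nesterov's accelerated gradient method; OGM's worst-case guarantee on the final iterate is a factor √2 better, which is the speedup the article's Lyapunov-function and linear-coupling analyses explain.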