Factor-√2 Acceleration of Accelerated Gradient Methods
Abstract: The optimized gradient method (OGM) provides a factor-√2 speedup upon Nesterov...
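The comparison in the abstract can be sketched numerically. Below is a minimal, illustrative implementation of Nesterov's accelerated gradient method and the optimized gradient method (the OGM of Kim and Fessler) on a toy quadratic; the test problem, function names, and iteration count are assumptions for illustration, not taken from the article itself.

```python
import numpy as np

def make_problem(seed=0, n=20):
    """Toy smooth convex quadratic f(x) = 0.5 * x^T A x (illustrative, not from the article)."""
    rng = np.random.default_rng(seed)
    M = rng.standard_normal((n, n))
    A = M.T @ M + np.eye(n)          # positive definite Hessian
    L = np.linalg.eigvalsh(A)[-1]    # smoothness constant = largest eigenvalue
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    return f, grad, L, n

def agd(grad, L, x0, iters):
    """Nesterov's accelerated gradient method with the standard theta-sequence momentum."""
    x, y, theta = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        y_next = x - grad(x) / L
        theta_next = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        x = y_next + ((theta - 1) / theta_next) * (y_next - y)
        y, theta = y_next, theta_next
    return y

def ogm(grad, L, x0, iters):
    """Optimized gradient method: same momentum plus an extra correction term,
    with a modified parameter on the final step."""
    x, y, theta = x0.copy(), x0.copy(), 1.0
    for k in range(iters):
        y_next = x - grad(x) / L
        if k < iters - 1:
            theta_next = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        else:
            theta_next = (1 + np.sqrt(1 + 8 * theta**2)) / 2  # final-iterate parameter
        x = (y_next
             + ((theta - 1) / theta_next) * (y_next - y)
             + (theta / theta_next) * (y_next - x))
        y, theta = y_next, theta_next
    return x

f, grad, L, n = make_problem()
x0 = np.ones(n)
f0 = f(x0)                            # minimizer is x* = 0 with f* = 0
f_agd = f(agd(grad, L, x0, 500))
f_ogm = f(ogm(grad, L, x0, 500))
```

Both methods use only one gradient evaluation per iteration; OGM differs from AGD by the extra `(theta / theta_next) * (y_next - x)` correction and the final-step parameter, which is where its constant-factor improvement in the worst-case bound comes from.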
Main Authors: Park, Chanwoo; Park, Jisun; Ryu, Ernest K.
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Format: Article
Language: English
Published: Springer US, 2023
Online Access: https://hdl.handle.net/1721.1/152276
Similar Items
- Factor-√2 Acceleration of Accelerated Gradient Methods
  by: Park, Chanwoo, et al.
  Published: (2023)
- Gradient accelerated cosmology
  by: Jaime Ruiz-Zapatero
  Published: (2024)
- Robust accelerated gradient methods for machine learning
  by: Fallah, Alireza
  Published: (2019)
- Implementing accelerated key-value store: From SSDs to datacenter servers
  by: Chung, Chanwoo
  Published: (2023)
- Hybrid Modified Accelerated Gradient Method for Optimization Processes
  by: Milena J. Petrović, et al.
  Published: (2024-02-01)