Accelerating incremental gradient optimization with curvature information
Abstract: This paper studies an acceleration technique for the incremental aggregated gradient (IAG) method that uses curvature information to solve strongly convex finite-sum optimization problems. Such problems arise in large-scale learning applications. Our techn...
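For context on the method being accelerated, a minimal sketch of the plain IAG update (without the paper's curvature-based acceleration) is below. It maintains a table of the most recently evaluated gradient of each component f_i and steps along their possibly stale aggregate, refreshing one component per iteration. The function names, cyclic selection order, and toy least-squares problem are illustrative assumptions, not the paper's own implementation.

```python
import numpy as np

def iag(grads, x0, step, n_iters):
    """Sketch of the incremental aggregated gradient (IAG) method
    for minimizing f(x) = (1/n) * sum_i f_i(x).

    grads: list of per-component gradient functions grad f_i.
    Stores the last evaluated gradient of each component and steps
    along the (possibly stale) average, refreshing one component
    per iteration in cyclic order (a common choice; the selection
    rule is an assumption here).
    """
    x = np.asarray(x0, dtype=float)
    n = len(grads)
    table = [g(x) for g in grads]   # stored, possibly stale gradients
    agg = np.sum(table, axis=0)     # running sum of stored gradients
    for k in range(n_iters):
        i = k % n                   # cyclic component selection
        new_g = grads[i](x)
        agg += new_g - table[i]     # O(d) incremental aggregate update
        table[i] = new_g
        x = x - step * agg / n      # step on the averaged gradient
    return x

# Toy strongly convex finite sum: f_i(x) = 0.5 * (a_i @ x - b_i)^2
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 3))
b = rng.standard_normal(10)
grads = [lambda x, a=A[i], bi=b[i]: a * (a @ x - bi) for i in range(10)]

x_hat = iag(grads, np.zeros(3), step=0.1, n_iters=5000)
x_star = np.linalg.lstsq(A, b, rcond=None)[0]  # least-squares minimizer
```

On this toy problem `x_hat` converges to the least-squares solution `x_star`; the paper's contribution is to speed up this baseline by incorporating curvature (second-order) information.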
Main Authors: Wai, Hoi-To; Shi, Wei; Uribe, César A.; Nedić, Angelia; Scaglione, Anna
Other Authors: Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
Format: Article
Language: English
Published: Springer US, 2021
Online Access: https://hdl.handle.net/1721.1/131862
Similar Items
- Constrained Consensus and Optimization in Multi-Agent Networks
  by: Nedic, Angelia, et al.
  Published: (2011)
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
  by: Gurbuzbalaban, Mert, et al.
  Published: (2018)
- A universally optimal multistage accelerated stochastic gradient method
  by: Aybat, NS, et al.
  Published: (2021)
- Factor-√2 Acceleration of Accelerated Gradient Methods
  by: Park, Chanwoo, et al.
  Published: (2023)
- Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
  by: Vanli, Nuri Denizcan, et al.
  Published: (2019)