AG-SGD: Angle-Based Stochastic Gradient Descent

In the field of neural networks, stochastic gradient descent (SGD) is often employed as an effective method for accelerating convergence. Generating the new gradient from the past gradient is a common approach adopted by many existing optimization algorithms. Since the past gradient is not c...
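The truncated abstract refers to optimizers that form the new update direction from past gradients. As a point of reference only, below is a minimal Python sketch of a classical momentum-style update of that kind; it is not the paper's angle-based rule (AG-SGD), and the function name, learning rate, and decay factor are illustrative assumptions.

```python
import numpy as np

def momentum_sgd_step(params, grad, velocity, lr=0.01, beta=0.9):
    """One update step that blends accumulated past-gradient information
    (stored in `velocity`) with the current gradient `grad`.
    Note: a generic momentum-style update, not the AG-SGD rule."""
    velocity = beta * velocity + grad   # combine past and current gradients
    params = params - lr * velocity     # descend along the blended direction
    return params, velocity

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2*w.
w = np.array([1.0, -2.0])
v = np.zeros_like(w)
for _ in range(100):
    g = 2 * w
    w, v = momentum_sgd_step(w, g, v)
print(w)  # approaches the minimizer [0, 0]
```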

Bibliographic Details
Main Authors: Chongya Song, Alexander Pons, Kang Yen
Format: Article
Language: English
Published: IEEE, 2021-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9343305/