AdaCB: An Adaptive Gradient Method with Convergence Range Bound of Learning Rate

Adaptive gradient descent methods such as Adam, RMSprop, and AdaGrad have achieved great success in training deep learning models. These methods adaptively adjust the learning rate, resulting in faster convergence. Recent studies have shown that their problems include extreme learning rates, non-conve...
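
The abstract refers to adaptive methods that scale the step size per parameter from running gradient statistics. Below is a minimal illustrative sketch of a standard Adam update in Python; it is not the AdaCB method proposed in the article, and the function and variable names are chosen here only for illustration.

```python
# Minimal sketch of a standard Adam update (illustrative; not the article's AdaCB method).
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step: m and v are running first/second moment estimates, t is the step count."""
    m = beta1 * m + (1 - beta1) * grad           # update biased first moment
    v = beta2 * v + (1 - beta2) * grad ** 2      # update biased second moment
    m_hat = m / (1 - beta1 ** t)                 # bias correction
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-parameter effective learning rate
    return theta, m, v

# Toy usage: minimize f(theta) = ||theta||^2
theta = np.array([1.0, -2.0])
m = v = np.zeros_like(theta)
for t in range(1, 201):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t)
print(theta)  # approaches [0, 0]
```

The division by the square root of the second-moment estimate is what makes the effective learning rate per-parameter and history-dependent; the extreme values this quantity can take are the kind of issue the abstract alludes to.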


Bibliographic Details
Main Authors: Xuanzhi Liao, Shahnorbanun Sahran, Azizi Abdullah, Syaimak Abdul Shukor
Format: Article
Language: English
Published: MDPI AG 2022-09-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/12/18/9389