PID controller‐based adaptive gradient optimizer for deep neural networks

Abstract: Due to improper selection of the gradient update direction or the learning rate, SGD-based optimization algorithms for deep learning suffer from oscillation and slow convergence. Although the Adam algorithm can adaptively adjust the update direction and the learning rate at the same time, it still has the overshoot...
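
The abstract outlines the core idea of treating the gradient update as a PID control problem: a proportional term reacts to the current gradient, an integral term accumulates past gradients (much like momentum), and a derivative term reacts to the change in gradient to damp oscillation. The sketch below illustrates such a PID-style update rule; the function name, hyperparameters, and the decayed integral term are illustrative assumptions, not the authors' exact formulation.

import numpy as np

# Hypothetical sketch of a PID-style gradient update (not the paper's exact
# algorithm): P uses the current gradient, I keeps a decayed sum of past
# gradients (momentum-like), D uses the change in gradient to damp oscillation.
def pid_sgd_step(w, grad, state, lr=0.1, kp=1.0, ki=0.5, kd=0.2, decay=0.9):
    integral = state.get("integral", np.zeros_like(w))
    prev_grad = state.get("prev_grad", np.zeros_like(w))

    integral = decay * integral + grad      # I: decayed running sum of gradients
    derivative = grad - prev_grad           # D: change in gradient between steps
    update = kp * grad + ki * integral + kd * derivative

    state["integral"] = integral
    state["prev_grad"] = grad
    return w - lr * update, state

# Toy usage on f(w) = 0.5 * ||w||^2, whose gradient is w itself.
w = np.array([5.0, -3.0])
state = {}
for _ in range(200):
    w, state = pid_sgd_step(w, w, state)
print(w)  # converges toward the minimum at the origin

The derivative term is what distinguishes this scheme from plain momentum SGD (a PI controller in this view): it penalizes rapid changes in the gradient, which is the mechanism the abstract credits with reducing overshoot and oscillation.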


Bibliographic Details
Main Authors: Mingjun Dai, Zelong Zhang, Xiong Lai, Xiaohui Lin, Hui Wang
Format: Article
Language: English
Published: Wiley 2023-10-01
Series: IET Control Theory & Applications
Online Access: https://doi.org/10.1049/cth2.12404