The Improved Stochastic Fractional Order Gradient Descent Algorithm

This paper proposes improved stochastic gradient descent (SGD) algorithms with a fractional-order gradient for the online optimization problem. For three scenarios, namely a standard learning rate, an adaptive-gradient learning rate, and a momentum learning rate, three new SGD algorithms are...
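The abstract describes replacing the integer-order gradient in SGD with a fractional-order one. The paper's exact update rules are not given in this record, but a common Caputo-style approximation from the fractional gradient descent literature scales the ordinary gradient elementwise by |x - x_prev|^(1-α)/Γ(2-α). The sketch below is a generic illustration under that assumption, not the authors' algorithm; the function name and constants are hypothetical.

```python
import numpy as np
from math import gamma

def fractional_gd_step(x, grad, x_prev, lr=0.1, alpha=0.9, eps=1e-8):
    """One generic fractional-order gradient step (Caputo-style
    approximation; hypothetical, not the paper's exact update).
    The integer-order gradient is scaled elementwise by
    |x - x_prev|^(1 - alpha) / Gamma(2 - alpha); eps avoids 0**negative
    issues when consecutive iterates coincide."""
    scale = np.abs(x - x_prev + eps) ** (1.0 - alpha) / gamma(2.0 - alpha)
    return x - lr * grad * scale

# Usage on the quadratic f(x) = ||x||^2 / 2, whose gradient is x itself.
x_prev = np.array([2.0, -2.0])
x = np.array([1.5, -1.5])
for _ in range(200):
    x, x_prev = fractional_gd_step(x, x, x_prev), x
print(x)  # iterates shrink toward the minimizer at the origin
```

With alpha = 1 the scale factor reduces to a constant, recovering ordinary gradient descent up to a 1/Γ(1) = 1 factor; alpha < 1 makes the effective step size depend on how far the iterate moved last step.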


Bibliographic Details
Main Authors: Yang Yang, Lipo Mo, Yusen Hu, Fei Long
Format: Article
Language: English
Published: MDPI AG 2023-08-01
Series: Fractal and Fractional
Online Access: https://www.mdpi.com/2504-3110/7/8/631