Adaptive Stochastic Conjugate Gradient Optimization for Backpropagation Neural Networks

Backpropagation neural networks are widely used to solve complex problems in various disciplines. However, optimizing their parameters remains a significant challenge. Traditional gradient-based optimization methods, such as stochastic gradient descent (SGD), often exhibit slow convergence and hype...
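To illustrate why conjugate-gradient-style updates can converge faster than plain gradient descent on ill-conditioned problems, here is a minimal sketch of the classic linear conjugate gradient method (Fletcher–Reeves form) on a toy quadratic. This is a generic textbook algorithm, not the adaptive stochastic variant proposed in the article; the function name `linear_cg` and the toy matrix `A` and vector `b` are illustrative assumptions.

```python
import numpy as np

def linear_cg(A, b, x, iters):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive-definite A.

    Generic conjugate gradient sketch (not the paper's adaptive stochastic
    method): each search direction is made conjugate to the previous one,
    which avoids the zig-zagging of steepest descent.
    """
    r = b - A @ x                 # residual = negative gradient at x
    d = r                         # first direction = steepest descent
    for _ in range(iters):
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)        # exact line search along d
        x = x + alpha * d
        r_new = r - alpha * Ad
        if np.linalg.norm(r_new) < 1e-12:  # converged: residual vanished
            break
        beta = (r_new @ r_new) / (r @ r)  # Fletcher-Reeves coefficient
        d = r_new + beta * d              # new conjugate direction
        r = r_new
    return x

# Toy ill-conditioned quadratic; the minimizer is A^{-1} b = (1, 1).
A = np.array([[10.0, 0.0], [0.0, 1.0]])
b = np.array([10.0, 1.0])
x_star = linear_cg(A, b, np.zeros(2), iters=5)
```

For an n-dimensional quadratic, exact-arithmetic conjugate gradient reaches the minimizer in at most n iterations, whereas fixed-step gradient descent on this matrix (condition number 10) needs many more steps.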

Full description

Bibliographic details
Main Authors: Ibrahim Abaker Targio Hashem, Fadele Ayotunde Alaba, Muhammad Haruna Jumare, Ashraf Osman Ibrahim, Anas Waleed Abulfaraj
Format: Article
Language: English
Published: IEEE 2024-01-01
Series: IEEE Access
Subjects:
Online access: https://ieeexplore.ieee.org/document/10445451/