Adaptive Stochastic Conjugate Gradient Optimization for Backpropagation Neural Networks
Backpropagation neural networks are widely used to solve complicated problems across many disciplines, yet optimizing their parameters remains a significant challenge. Traditional gradient-based optimization methods, such as stochastic gradient descent (SGD), often exhibit slow convergence and sensitivity to hyperparameters.
Main Authors: | Ibrahim Abaker Targio Hashem, Fadele Ayotunde Alaba, Muhammad Haruna Jumare, Ashraf Osman Ibrahim, Anas Waleed Abulfaraj |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2024-01-01 |
Series: | IEEE Access |
Subjects: | Adaptive stochastic conjugate gradient; backpropagation; neural networks; stochastic gradient descent |
Online Access: | https://ieeexplore.ieee.org/document/10445451/ |
_version_ | 1797269141455896576 |
---|---|
author | Ibrahim Abaker Targio Hashem; Fadele Ayotunde Alaba; Muhammad Haruna Jumare; Ashraf Osman Ibrahim; Anas Waleed Abulfaraj |
author_facet | Ibrahim Abaker Targio Hashem; Fadele Ayotunde Alaba; Muhammad Haruna Jumare; Ashraf Osman Ibrahim; Anas Waleed Abulfaraj |
author_sort | Ibrahim Abaker Targio Hashem |
collection | DOAJ |
description | Backpropagation neural networks are widely used to solve complicated problems across many disciplines, yet optimizing their parameters remains a significant challenge. Traditional gradient-based optimization methods, such as stochastic gradient descent (SGD), often exhibit slow convergence and sensitivity to hyperparameters. This study proposes an adaptive stochastic conjugate gradient (ASCG) optimization strategy for backpropagation neural networks. ASCG combines the advantages of stochastic optimization and conjugate gradient techniques to increase training efficiency and convergence speed. Based on the observed gradients, the algorithm adaptively computes the learning rate and search direction at each iteration, allowing for quicker convergence and better generalization. Experimental findings on benchmark datasets show that ASCG outperforms standard optimization techniques in convergence time and model performance. The proposed ASCG algorithm thus provides a viable way to improve the training of backpropagation neural networks, making them more effective at tackling complicated problems across several domains. As a result, the information from the initial seeds formed during training grows. The coordinated operation of ASCG's conjugate gradient and adaptive stochastic components improves learning and helps the optimizer reach global minima. Our results indicate that the ASCG algorithm achieves 21 percent higher accuracy on the HMT dataset and outperforms existing methods on the DIR-Lab dataset. Experiments also show that the conjugate gradient reaches 95 percent efficiency when using principal component analysis features, compared to 94 percent with the correlation heatmap feature selection approach, with an MSE of 0.0678. |
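The abstract above describes ASCG as adaptively setting the learning rate and search direction from observed mini-batch gradients. The sketch below is a generic illustration of that idea, not the authors' published algorithm: the Polak-Ribiere conjugacy coefficient, the non-negativity restart, and the gradient-norm-based learning-rate schedule are common choices assumed here for demonstration, and the names `ascg_step` and `grad_fn` are hypothetical.

```python
# Minimal sketch of one adaptive stochastic conjugate gradient (ASCG) step.
# Illustrative only; the paper's exact update rules may differ.
import numpy as np

def ascg_step(w, grad_fn, state, eta0=0.1, eps=1e-8):
    """One ASCG update on parameter vector w.

    grad_fn(w) returns a stochastic (mini-batch) gradient at w.
    state carries the previous gradient and search direction.
    """
    g = grad_fn(w)
    g_prev, d_prev = state.get("g"), state.get("d")

    if d_prev is None:
        d = -g  # first iteration: plain steepest descent
    else:
        # Polak-Ribiere conjugacy coefficient, clipped at zero (restart).
        beta = max(0.0, (g @ (g - g_prev)) / (g_prev @ g_prev + eps))
        d = -g + beta * d_prev

    # Adapt the step size to the observed gradient magnitude, so steps
    # shrink automatically in steep regions of the loss surface.
    eta = eta0 / (1.0 + np.sqrt(g @ g))

    w = w + eta * d
    state["g"], state["d"] = g, d
    return w, state

# Usage on a noisy quadratic, f(w) = 0.5 * ||w||^2, as a stand-in
# for a mini-batch training loss.
rng = np.random.default_rng(0)
grad_fn = lambda w: w + 0.01 * rng.standard_normal(w.shape)
w, state = np.ones(5), {}
for _ in range(200):
    w, state = ascg_step(w, grad_fn, state)
print(np.linalg.norm(w))  # should be close to zero
```

Clipping beta at zero restarts the direction to steepest descent whenever conjugacy degrades, a standard safeguard when gradients are stochastic rather than exact.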
first_indexed | 2024-04-25T01:43:39Z |
format | Article |
id | doaj.art-d83df0e4bdae42a88b88c08aafb7cb19 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-04-25T01:43:39Z |
publishDate | 2024-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | Record ID: doaj.art-d83df0e4bdae42a88b88c08aafb7cb19 (indexed 2024-03-08T00:00:23Z). Language: English. Publisher: IEEE. Series: IEEE Access (ISSN 2169-3536). Published: 2024-01-01, vol. 12, pp. 33757-33768. DOI: 10.1109/ACCESS.2024.3370859. IEEE article no. 10445451.
Title: Adaptive Stochastic Conjugate Gradient Optimization for Backpropagation Neural Networks
Authors:
- Ibrahim Abaker Targio Hashem (https://orcid.org/0000-0001-7611-9540), Computer Science Department, University of Sharjah, Sharjah, United Arab Emirates
- Fadele Ayotunde Alaba (https://orcid.org/0000-0002-5842-1452), Department of Computer Science Education, Federal College of Education, Zaria, Kaduna, Nigeria
- Muhammad Haruna Jumare (https://orcid.org/0009-0000-7217-8885), Department of Computer Science Education, Federal College of Education, Zaria, Kaduna, Nigeria
- Ashraf Osman Ibrahim (https://orcid.org/0000-0001-5526-3623), Creative Advanced Machine Intelligence Research Centre, Faculty of Computing and Informatics, Universiti Malaysia Sabah, Kota Kinabalu, Malaysia
- Anas Waleed Abulfaraj, Department of Information Systems, King Abdulaziz University, Rabigh, Saudi Arabia
Abstract: see the description field above.
Online access: https://ieeexplore.ieee.org/document/10445451/
Keywords: Adaptive stochastic conjugate gradient; backpropagation; neural networks; stochastic gradient descent |
spellingShingle | Ibrahim Abaker Targio Hashem; Fadele Ayotunde Alaba; Muhammad Haruna Jumare; Ashraf Osman Ibrahim; Anas Waleed Abulfaraj
Adaptive Stochastic Conjugate Gradient Optimization for Backpropagation Neural Networks
IEEE Access
Adaptive stochastic conjugate gradient; backpropagation; neural networks; stochastic gradient descent |
title | Adaptive Stochastic Conjugate Gradient Optimization for Backpropagation Neural Networks |
title_full | Adaptive Stochastic Conjugate Gradient Optimization for Backpropagation Neural Networks |
title_fullStr | Adaptive Stochastic Conjugate Gradient Optimization for Backpropagation Neural Networks |
title_full_unstemmed | Adaptive Stochastic Conjugate Gradient Optimization for Backpropagation Neural Networks |
title_short | Adaptive Stochastic Conjugate Gradient Optimization for Backpropagation Neural Networks |
title_sort | adaptive stochastic conjugate gradient optimization for backpropagation neural networks |
topic | Adaptive stochastic conjugate gradient; backpropagation; neural networks; stochastic gradient descent |
url | https://ieeexplore.ieee.org/document/10445451/ |
work_keys_str_mv | AT ibrahimabakertargiohashem adaptivestochasticconjugategradientoptimizationforbackpropagationneuralnetworks AT fadeleayotundealaba adaptivestochasticconjugategradientoptimizationforbackpropagationneuralnetworks AT muhammadharunajumare adaptivestochasticconjugategradientoptimizationforbackpropagationneuralnetworks AT ashrafosmanibrahim adaptivestochasticconjugategradientoptimizationforbackpropagationneuralnetworks AT anaswaleedabulfaraj adaptivestochasticconjugategradientoptimizationforbackpropagationneuralnetworks |