Improved Three-term Conjugate Gradient Algorithm For Training Neural Network

Bibliographic Details
Main Author: Abbas H. Taqi
Format: Article
Language: English
Published: Faculty of Computer Science and Mathematics, University of Kufa, 2015-06-01
Series: Journal of Kufa for Mathematics and Computer
Subjects:
Online Access: https://journal.uokufa.edu.iq/index.php/jkmc/article/view/2137
Description
Summary: A new three-term conjugate gradient algorithm for training feed-forward neural networks is developed. It is a vector-based training algorithm derived from the DFP quasi-Newton method and requires only O(n) memory. Global convergence of the proposed algorithm is established for convex functions under the Wolfe conditions. Results of numerical experiments are included and compared with those of other well-known training algorithms in this field.
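For context, a three-term conjugate gradient method builds the search direction from the current negative gradient plus two correction terms; the generic form below is a sketch only, since the article's specific coefficient choices are not given in this record:

d_{k+1} = -g_{k+1} + \beta_k d_k + \theta_k y_k, \qquad y_k = g_{k+1} - g_k, \qquad x_{k+1} = x_k + \alpha_k d_k,

where the step length \alpha_k is chosen to satisfy the Wolfe conditions

f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k g_k^{T} d_k, \qquad g(x_k + \alpha_k d_k)^{T} d_k \ge c_2 g_k^{T} d_k, \qquad 0 < c_1 < c_2 < 1.

Because only a few n-dimensional vectors (gradients, directions, and differences) are stored rather than any matrix, the memory requirement is O(n).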
ISSN: 2076-1171, 2518-0010