The Hessian by blocks for neural network by backward propagation

The back-propagation algorithm combined with stochastic gradient methods, together with increases in computing performance, lies at the origin of the recent deep-learning trend. For some problems, however, the convergence of gradient methods is still very slow. Newton's method offers potential advantages in terms o...


Bibliographic Details
Main Authors: Radhia Bessi, Nabil Gmati
Format: Article
Language: English
Published: Taylor & Francis Group 2024-12-01
Series: Journal of Taibah University for Science
Online Access: https://www.tandfonline.com/doi/10.1080/16583655.2024.2327102