Scaled three-term conjugate gradient method via Davidon-Fletcher-Powell update for unconstrained optimization

Bibliographic Details
Main Author: Ibrahim, Arzuka
Format: Thesis
Language: English
Published: 2015
Subjects:
Online Access: http://psasir.upm.edu.my/id/eprint/67646/1/IPM%202015%2018%20IR.pdf
Description
Summary: This thesis focuses on the development of a Scaled Three-Term Conjugate Gradient Method based on the Davidon-Fletcher-Powell (DFP) quasi-Newton update for unconstrained optimization. The DFP method combines the merits of Newton's method and the steepest descent method while overcoming their disadvantages. Over the years, however, the DFP update has been neglected because it lacks a self-correcting property for bad Hessian approximations. In this thesis, we propose a Scaled Three-Term Conjugate Gradient Method that uses the DFP update for the inverse Hessian approximation in a memoryless quasi-Newton framework and satisfies both the sufficient descent and the conjugacy conditions. The basic idea is to restart the DFP update with a multiple of the identity matrix at every iteration. An acceleration scheme is incorporated into the proposed method to enhance the reduction in function value. Numerical results from an implementation of the proposed method on a set of standard unconstrained optimization problems show that the method is promising and exhibits superior numerical performance compared with other well-known conjugate gradient methods.
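To make the construction concrete, the following is a minimal sketch of a memoryless DFP step restarted from a scaled identity, in standard quasi-Newton notation ($s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$, gradient $g_k$, scaling parameter $\theta_k > 0$); the thesis's exact formulas and its choice of $\theta_k$ may differ. Substituting $H_k = \theta_k I$ into the DFP inverse-Hessian update gives

\[
H_{k+1} \;=\; \theta_k I \;-\; \theta_k \,\frac{y_k y_k^{\top}}{y_k^{\top} y_k} \;+\; \frac{s_k s_k^{\top}}{s_k^{\top} y_k},
\]

so the search direction $d_{k+1} = -H_{k+1} g_{k+1}$ expands into the three terms that give the method its name:

\[
d_{k+1} \;=\; -\theta_k g_{k+1} \;+\; \theta_k \,\frac{y_k^{\top} g_{k+1}}{y_k^{\top} y_k}\, y_k \;-\; \frac{s_k^{\top} g_{k+1}}{s_k^{\top} y_k}\, s_k.
\]

Because $H_{k+1}$ is rebuilt from $\theta_k I$ at every iteration, no matrix needs to be stored or factored, and each direction costs only a few inner products, which is what makes the memoryless restart attractive for large-scale problems.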