Modified nonlinear conjugate gradient method with sufficient descent condition for unconstrained optimization

Bibliographic Details
Main Authors: Liu Jinkui, Wang Shaoheng
Format: Article
Language: English
Published: SpringerOpen 2011-01-01
Series: Journal of Inequalities and Applications
Online Access: http://www.journalofinequalitiesandapplications.com/content/2011/1/57
Description
Summary: In this paper, an efficient modified nonlinear conjugate gradient method for solving unconstrained optimization problems is proposed. An attractive property of the modified method is that the direction generated at each step is always a descent direction, without any line search. Global convergence of the modified method is established under the general Wolfe line search conditions. Numerical results on the unconstrained optimization problems of Moré and Garbow (ACM Trans Math Softw 7, 17-41, 1981) show that the modified method is efficient and stable in comparison with the well-known Polak-Ribière-Polyak method, the CG-DESCENT method, and the DSP-CG method, so it can be widely used in scientific computation.
Mathematics Subject Classification (2010): 90C26 · 65H10
ISSN: 1025-5834; 1029-242X
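
Note: the record reproduces only the abstract, not the authors' modified conjugate gradient update. For orientation, the following is a minimal Python sketch of a generic nonlinear conjugate gradient iteration with a strong Wolfe line search and a descent safeguard; the PRP+ beta formula, function names, and tolerances here are placeholder assumptions for illustration, not the modified method proposed by Liu and Wang.

```python
# Minimal sketch: generic nonlinear CG with a strong Wolfe line search.
# NOTE: this is NOT the modified update of Liu and Wang; their beta formula is
# not given in this record, so a standard PRP+ rule is used as a stand-in.
import numpy as np
from scipy.optimize import line_search

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Strong Wolfe line search along d (SciPy's implementation)
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                     # line search failed: restart
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0]
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP+ update; clipping beta at zero gives automatic restarts
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        if g_new @ d >= 0:                    # safeguard: enforce a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    print(nonlinear_cg(rosen, rosen_der, np.array([-1.2, 1.0])))  # approaches [1., 1.]
```

The explicit descent safeguard above mirrors the sufficient descent property highlighted in the abstract, although the paper achieves that property through its modified update itself rather than through a restart check.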