On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems.
It is shown that the steepest-descent and Newton's methods for unconstrained nonconvex optimization under standard assumptions may both require a number of iterations and function evaluations arbitrarily close to O(ε^{-2}) to drive the norm of the gradient below ε. This shows that the upper bound...
Main Authors:
Format: Journal article
Language: English
Published: 2010
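The complexity result summarized in the abstract counts how many iterations a first-order method may need before the gradient norm falls below a tolerance ε. As a purely illustrative sketch of that termination criterion (not the worst-case examples constructed in the paper), the following Python snippet runs fixed-step steepest descent until ||grad f(x)|| <= eps on a simple smooth nonconvex function; the objective, step size, and starting point are assumptions chosen only for the example.

```python
# Minimal steepest-descent sketch illustrating the eps-approximate first-order
# stopping rule ||grad f(x)|| <= eps mentioned in the abstract. The objective
# (a smooth nonconvex double well), the fixed step size, and the starting point
# are illustrative assumptions, not the constructions analyzed in the paper.
import numpy as np

def steepest_descent(grad, x0, eps=1e-6, step=0.1, max_iter=10000):
    """Fixed-step steepest descent, stopping once the gradient norm drops below eps."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:   # first-order termination test
            return x, k
        x = x - step * g               # steepest-descent step
    return x, max_iter

# Gradient of f(x1, x2) = x1**4/4 - x1**2/2 + x2**2/2, which is smooth and nonconvex in x1.
def double_well_grad(x):
    return np.array([x[0] ** 3 - x[0], x[1]])

if __name__ == "__main__":
    x_final, iters = steepest_descent(double_well_grad, x0=[2.0, 1.5])
    print(f"gradient norm <= 1e-06 reached after {iters} iterations at x = {x_final}")
```

The paper's point is about how the required iteration count scales as ε shrinks; the sketch only shows the stopping rule itself, not the worst-case behavior.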