Worst-case evaluation complexity of regularization methods for smooth unconstrained optimization using Hölder continuous gradients
The worst-case behaviour of a general class of regularization algorithms is considered in the case where only objective function values and associated gradient vectors are evaluated. Upper bounds are derived on the number of such evaluations that are needed for the algorithm to produce an approximat...
| Main Authors: | Cartis, C, Gould, N, Toint, P |
| --- | --- |
| Format: | Journal article |
| Published: | Taylor and Francis, 2017 |
Similar Items
- Worst-case evaluation complexity and optimality of second-order methods for nonconvex smooth optimization
  by: Cartis, C, et al.
  Published: (2018)
- Adaptive cubic regularisation methods for unconstrained optimization. Part II: worst-case function- and derivative-evaluation complexity.
  by: Cartis, C, et al.
  Published: (2011)
- Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization.
  by: Cartis, C, et al.
  Published: (2012)
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems.
  by: Cartis, C, et al.
  Published: (2010)
- Sharp worst-case evaluation complexity bounds for arbitrary-order nonconvex optimization with inexpensive constraints
  by: Cartis, C, et al.
  Published: (2020)