Worst-case evaluation complexity of regularization methods for smooth unconstrained optimization using Hölder continuous gradients
The worst-case behaviour of a general class of regularization algorithms is considered in the case where only objective function values and associated gradient vectors are evaluated. Upper bounds are derived on the number of such evaluations that are needed for the algorithm to produce an approximat...
Main Authors: | Cartis, C, Gould, N, Toint, P
---|---
Format: | Journal article
Published: | Taylor and Francis, 2017
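To illustrate the setting described in the abstract above, the following is a generic sketch, not taken from the record itself, of the regularized step subproblem typically used by first-order regularization methods when the gradient is only Hölder continuous; the symbols σ_k, β, L, and ε are assumptions introduced here for illustration.

```latex
% Hölder continuity of the gradient with exponent \beta \in (0,1]:
%   \|\nabla f(x) - \nabla f(y)\| \le L \|x - y\|^{\beta}.
% At iterate x_k, such methods typically minimize a regularized linear model of f:
\[
  m_k(s) \;=\; f(x_k) \;+\; \nabla f(x_k)^{T} s \;+\; \frac{\sigma_k}{1+\beta}\,\|s\|^{1+\beta},
\]
% accepting the step when it produces sufficient decrease and adapting \sigma_k otherwise.
% Under these assumptions, the worst-case number of function/gradient evaluations
% usually quoted for reaching \|\nabla f(x_k)\| \le \epsilon is of order
\[
  O\!\left(\epsilon^{-\frac{1+\beta}{\beta}}\right),
\]
% which recovers the familiar O(\epsilon^{-2}) bound when \beta = 1 (Lipschitz-continuous gradients).
```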
Similar Items
- Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization.
  by: Cartis, C, et al.
  Published: (2012)
- Adaptive cubic regularisation methods for unconstrained optimization. Part II: worst-case function- and derivative-evaluation complexity.
  by: Cartis, C, et al.
  Published: (2011)
- Worst-case evaluation complexity and optimality of second-order methods for nonconvex smooth optimization
  by: Cartis, C, et al.
  Published: (2018)
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems.
  by: Cartis, C, et al.
  Published: (2010)
- A concise second-order complexity analysis for unconstrained optimization using high-order regularized models
  by: Cartis, C, et al.
  Published: (2019)