A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization


Bibliographic Details
Main Author: Kadir Kiran
Format: Article
Language: English
Published: Croatian Operational Research Society 2022-01-01
Series: Croatian Operational Research Review
Online Access: https://hrcak.srce.hr/file/406247
Description
Summary: This paper conducts a computational performance benchmark of the steepest descent method and three well-known conjugate gradient methods (Fletcher-Reeves, Polak-Ribiere and Hestenes-Stiefel) combined with six step length calculation techniques/conditions, namely backtracking, Armijo-backtracking, Goldstein, weak Wolfe, strong Wolfe, and the exact local minimizer, in unconstrained optimization. To this end, a series of computational experiments on a test function set is completed using the combinations of those optimization methods and line search conditions. During these experiments, the number of function evaluations at every iteration is monitored and recorded for each optimization method-line search condition combination. The total number of function evaluations is then used as the performance measure whenever the combination in question converges to a function's minimum within the given convergence tolerance. From these data, performance and data profiles are created for all the optimization method-line search condition combinations to obtain a reliable and efficient benchmark. It is determined that, for this test function set, the steepest descent-Goldstein combination is the fastest, whereas the steepest descent-exact local minimizer combination is the most robust, with high convergence accuracy. By making a trade-off between convergence speed and robustness, the steepest descent-weak Wolfe combination is identified as the optimal choice for this test function set.
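To illustrate the kind of method-line search combination benchmarked in the article, the following is a minimal Python sketch (not the author's code) of steepest descent with a weak Wolfe line search, counting function evaluations as the performance measure. The bisection-style search, helper names, parameter values (c1, c2, tolerance) and the Rosenbrock test function are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def weak_wolfe_step(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    """Find a step length satisfying the weak Wolfe conditions by bisection.

    Hypothetical helper for illustration only. Returns the step length and
    the number of function evaluations spent in the search.
    """
    lo, hi = 0.0, np.inf
    f0 = f(x)
    slope0 = grad(x) @ d          # directional derivative at x
    evals = 1                     # counts the evaluation of f(x)
    for _ in range(max_iter):
        fx = f(x + alpha * d)
        evals += 1
        if fx > f0 + c1 * alpha * slope0:
            # Sufficient decrease (Armijo) condition fails: shrink the step.
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * slope0:
            # Curvature condition fails: enlarge the step.
            lo = alpha
            alpha = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            break                 # both weak Wolfe conditions hold
    return alpha, evals

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=1000):
    """Steepest descent with the weak Wolfe line search; tracks total f-evaluations."""
    x, total_evals = np.asarray(x0, dtype=float), 0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        alpha, evals = weak_wolfe_step(f, grad, x, -g)
        total_evals += evals
        x = x - alpha * g
    return x, total_evals

# Usage on the Rosenbrock function, a common member of unconstrained test sets.
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_min, n_evals = steepest_descent(rosen, rosen_grad, [-1.2, 1.0])
```

In a benchmark of this kind, the total function-evaluation count returned for each solved test problem is what feeds the performance and data profiles used to compare the combinations.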
ISSN: 1848-0225, 1848-9931