A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization

This paper computationally benchmarks the performance of the steepest descent method and three well-known conjugate gradient methods (i.e., Fletcher-Reeves, Polak-Ribiere, and Hestenes-Stiefel) along with six different step-length calculation techniques/conditions, namely B...


Bibliographic Details
Main Author: Kadir Kiran
Format: Article
Language: English
Published: Croatian Operational Research Society 2022-01-01
Series: Croatian Operational Research Review
Subjects:
Online Access: https://hrcak.srce.hr/file/406247
_version_ 1811223892769898496
author Kadir Kiran
author_facet Kadir Kiran
author_sort Kadir Kiran
collection DOAJ
description This paper computationally benchmarks the performance of the steepest descent method and three well-known conjugate gradient methods (i.e., Fletcher-Reeves, Polak-Ribiere, and Hestenes-Stiefel) along with six different step-length calculation techniques/conditions, namely Backtracking, Armijo-Backtracking, Goldstein, weak Wolfe, strong Wolfe, and the exact local minimizer, in unconstrained optimization. To this end, a series of computational experiments on a test function set is completed using the combinations of those optimization methods and line search conditions. During these experiments, the number of function evaluations at every iteration is monitored and recorded for each optimization method-line search condition combination. The total number of function evaluations is then taken as the performance measure when the combination in question converges to a function's minimum within the given convergence tolerance. From those data, performance and data profiles are created for all optimization method-line search condition combinations for the purpose of reliable and efficient benchmarking. It is determined that, for this test function set, the steepest descent-Goldstein combination is the fastest, whereas the steepest descent-exact local minimizer combination is the most robust, with high convergence accuracy. By making a trade-off between convergence speed and robustness, the steepest descent-weak Wolfe combination is identified as the optimal choice for this test function set.
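The optimization methods named in the abstract can be sketched in a few lines. This is a minimal illustration, not the benchmark code used in the paper: it pairs the steepest descent and Fletcher-Reeves/Polak-Ribiere/Hestenes-Stiefel directions with only one of the six step-length rules (Armijo backtracking), and all function names, the restart safeguard, and parameter defaults (`c1 = 1e-4`, `rho = 0.5`) are illustrative choices.

```python
import numpy as np

def backtracking_armijo(f, grad, x, d, alpha0=1.0, rho=0.5, c1=1e-4):
    """Shrink alpha until the Armijo sufficient-decrease condition holds:
    f(x + alpha*d) <= f(x) + c1*alpha*grad(x)^T d."""
    fx, slope = f(x), grad(x) @ d
    alpha = alpha0
    while f(x + alpha * d) > fx + c1 * alpha * slope:
        alpha *= rho
    return alpha

def minimize(f, grad, x0, method="SD", tol=1e-6, max_iter=10000):
    """Steepest descent ("SD") or nonlinear conjugate gradient with the
    Fletcher-Reeves ("FR"), Polak-Ribiere ("PR"), or Hestenes-Stiefel ("HS")
    beta formula, using Armijo backtracking for the step length."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:       # converged: gradient is (near) zero
            break
        alpha = backtracking_armijo(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        y = g_new - g                     # change in the gradient
        if method == "FR":
            beta = (g_new @ g_new) / (g @ g)
        elif method == "PR":
            beta = (g_new @ y) / (g @ g)
        elif method == "HS":
            beta = (g_new @ y) / (d @ y)
        else:                             # plain steepest descent
            beta = 0.0
        d = -g_new + beta * d
        if d @ g_new >= 0:                # safeguard: restart if not a descent direction
            d = -g_new
        g = g_new
    return x

# Usage on a simple convex quadratic whose unique minimizer is (3, -2):
f = lambda x: (x[0] - 3.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
grad_f = lambda x: np.array([2.0 * (x[0] - 3.0), 20.0 * (x[1] + 2.0)])
x_star = minimize(f, grad_f, x0=[0.0, 0.0], method="PR")
```

Counting the `f` calls inside `backtracking_armijo` (and the `grad` calls per iteration) is what yields the per-combination function-evaluation totals the paper uses as its performance measure.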
first_indexed 2024-04-12T08:40:19Z
format Article
id doaj.art-12537bfad33449949ed84a2d41ee1f0a
institution Directory Open Access Journal
issn 1848-0225
1848-9931
language English
last_indexed 2024-04-12T08:40:19Z
publishDate 2022-01-01
publisher Croatian Operational Research Society
record_format Article
series Croatian Operational Research Review
spelling doaj.art-12537bfad33449949ed84a2d41ee1f0a2022-12-22T03:39:53ZengCroatian Operational Research SocietyCroatian Operational Research Review1848-02251848-99312022-01-011317797A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained OptimizationKadir Kiran0Department of Airframe and Powerplant Maintenancehttps://hrcak.srce.hr/file/406247conjugate gradientline searchstep lengthsteepest descentoptimization
spellingShingle Kadir Kiran
A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization
Croatian Operational Research Review
conjugate gradient
line search
step length
steepest descent
optimization
title A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization
title_full A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization
title_fullStr A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization
title_full_unstemmed A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization
title_short A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization
title_sort benchmark study on steepest descent and conjugate gradient methods line search conditions combinations in unconstrained optimization
topic conjugate gradient
line search
step length
steepest descent
optimization
url https://hrcak.srce.hr/file/406247
work_keys_str_mv AT kadirkiran abenchmarkstudyonsteepestdescentandconjugategradientmethodslinesearchconditionscombinationsinunconstrainedoptimization
AT kadirkiran benchmarkstudyonsteepestdescentandconjugategradientmethodslinesearchconditionscombinationsinunconstrainedoptimization