An adaptive Bayesian approach to gradient-free global optimization
Many problems in science and technology require finding global minima or maxima of complicated objective functions. The importance of global optimization has inspired the development of numerous heuristic algorithms based on analogies with physical, chemical or biological systems. Here we present a...
Main Authors: | Jianneng Yu, Alexandre V Morozov |
---|---|
Format: | Article |
Language: | English |
Published: | IOP Publishing, 2024-01-01 |
Series: | New Journal of Physics |
Subjects: | gradient-free global optimization; global optimization algorithms; spin glasses; fitness landscapes |
Online Access: | https://doi.org/10.1088/1367-2630/ad23a3 |
_version_ | 1797310818155495424 |
---|---|
author | Jianneng Yu Alexandre V Morozov |
author_facet | Jianneng Yu Alexandre V Morozov |
author_sort | Jianneng Yu |
collection | DOAJ |
description | Many problems in science and technology require finding global minima or maxima of complicated objective functions. The importance of global optimization has inspired the development of numerous heuristic algorithms based on analogies with physical, chemical or biological systems. Here we present a novel algorithm, SmartRunner, which employs a Bayesian probabilistic model informed by the history of accepted and rejected moves to choose the next random trial. Thus, SmartRunner intelligently adapts its search strategy to a given objective function and moveset, with the goal of maximizing fitness gain (or energy loss) per function evaluation. Our approach is equivalent to adding a simple adaptive penalty to the original objective function, with SmartRunner performing hill ascent on the modified landscape. The adaptive penalty can be added to many other global optimization schemes, enhancing their ability to find high-quality solutions. We have explored SmartRunner’s performance on a standard set of test functions, the Sherrington–Kirkpatrick spin glass model, and Kauffman’s NK fitness model, finding that it compares favorably with several widely used alternative approaches to gradient-free optimization. |
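The abstract's core idea — adding an adaptive penalty to the objective and performing hill ascent on the modified landscape — can be illustrated with a minimal sketch. Note the penalty form (a fixed per-visit decrement on the current state) and the moveset interface below are illustrative assumptions, not the paper's actual Bayesian model; see the article at the DOI above for the real acceptance rule.

```python
import random

def adaptive_penalty_ascent(f, x0, neighbors, n_evals=10000, penalty=1e-3):
    """Hedged sketch of adaptive-penalty hill ascent in the spirit of
    SmartRunner. The linear per-visit penalty is an illustrative
    assumption, not the paper's Bayesian acceptance criterion.

    f         -- objective (fitness) to maximize
    x0        -- initial state (must be hashable)
    neighbors -- moveset: returns a random neighbor of a state
    penalty   -- penalty added per trial spent at a state
    """
    visits = {}                      # trials accumulated at each state
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    for _ in range(n_evals):
        # Each trial spent at x raises its penalty, so a local optimum
        # eventually sinks below its neighbors on the modified landscape.
        visits[x] = visits.get(x, 0) + 1
        y = neighbors(x)
        fy = f(y)
        eff_x = fx - penalty * visits[x]
        eff_y = fy - penalty * visits.get(y, 0)
        if eff_y >= eff_x:           # hill ascent on the modified landscape
            x, fx = y, fy
            if fy > best_f:
                best_x, best_f = y, fy
    return best_x, best_f
```

For example, maximizing `f(x) = -(x - 3)**2` over the integers with a `x ± 1` moveset drives the walker to `x = 3`; the accumulated penalty then pushes it off to keep exploring, while the best state found is retained.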
first_indexed | 2024-03-08T01:49:28Z |
format | Article |
id | doaj.art-efd0d3d98cce4972ac2ef47cdba000a0 |
institution | Directory Open Access Journal |
issn | 1367-2630 |
language | English |
last_indexed | 2024-03-08T01:49:28Z |
publishDate | 2024-01-01 |
publisher | IOP Publishing |
record_format | Article |
series | New Journal of Physics |
spelling | doaj.art-efd0d3d98cce4972ac2ef47cdba000a0 | 2024-02-14T11:32:19Z | eng | IOP Publishing | New Journal of Physics | 1367-2630 | 2024-01-01 | vol. 26, no. 2, art. 023027 | 10.1088/1367-2630/ad23a3 | An adaptive Bayesian approach to gradient-free global optimization | Jianneng Yu; Alexandre V Morozov (https://orcid.org/0000-0003-2598-7000) | Department of Physics and Astronomy, Rutgers University, Piscataway, NJ 08854, United States of America; Center for Quantitative Biology, Rutgers University, Piscataway, NJ 08854, United States of America | (abstract as in the description field) | https://doi.org/10.1088/1367-2630/ad23a3 | gradient-free global optimization; global optimization algorithms; spin glasses; fitness landscapes |
spellingShingle | Jianneng Yu Alexandre V Morozov An adaptive Bayesian approach to gradient-free global optimization New Journal of Physics gradient-free global optimization global optimization algorithms spin glasses fitness landscapes |
title | An adaptive Bayesian approach to gradient-free global optimization |
title_full | An adaptive Bayesian approach to gradient-free global optimization |
title_fullStr | An adaptive Bayesian approach to gradient-free global optimization |
title_full_unstemmed | An adaptive Bayesian approach to gradient-free global optimization |
title_short | An adaptive Bayesian approach to gradient-free global optimization |
title_sort | adaptive bayesian approach to gradient free global optimization |
topic | gradient-free global optimization; global optimization algorithms; spin glasses; fitness landscapes
url | https://doi.org/10.1088/1367-2630/ad23a3 |
work_keys_str_mv | AT jiannengyu anadaptivebayesianapproachtogradientfreeglobaloptimization AT alexandrevmorozov anadaptivebayesianapproachtogradientfreeglobaloptimization AT jiannengyu adaptivebayesianapproachtogradientfreeglobaloptimization AT alexandrevmorozov adaptivebayesianapproachtogradientfreeglobaloptimization |