Global Continuous Optimization with Error Bound and Fast Convergence
This paper considers global optimization with a black-box unknown objective function that can be non-convex and non-differentiable. Such a difficult optimization problem arises in many real-world applications, such as parameter tuning in machine learning, engineering design problems, and planning wit...
Main Authors: Maruyama, Yu; Zheng, Xiaoyu; Kawaguchi, Kenji
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Format: Article
Language: en_US
Published: Association for the Advancement of Artificial Intelligence, 2017
Online Access: http://hdl.handle.net/1721.1/107756, https://orcid.org/0000-0003-1839-7504
Similar Items
- Bayesian optimization with exponential convergence
  by: Kawaguchi, Kenji, et al.
  Published: (2018)
- Towards Practical Theory: Bayesian Optimization and Optimal Exploration
  by: Kawaguchi, Kenji
  Published: (2016)
- European pensions and global finance: continuity or convergence?
  by: Clark, G
  Published: (2002)
- European Pensions and Global Finance: Continuity or Convergence?
  by: Clark, G
  Published: (2002)
- Branching and bounding improvements for global optimization algorithms with Lipschitz continuity properties
  by: Cartis, C, et al.
  Published: (2015)