Paper Title

Adaptive Expansion Bayesian Optimization for Unbounded Global Optimization

Paper Authors

Wei Chen, Mark Fuge

Paper Abstract

Bayesian optimization is normally performed within fixed variable bounds. In cases like hyperparameter tuning for machine learning algorithms, setting the variable bounds is not trivial, and it is hard to guarantee that any fixed bounds will include the true global optimum. We propose a Bayesian optimization approach that only requires specifying an initial search space, which need not contain the global optimum, and expands the search space when necessary. However, over-exploration may occur during the search space expansion. Our method can adaptively balance exploration and exploitation in an expanding space. Results on a range of synthetic test functions and an MLP hyperparameter optimization task show that the proposed method outperforms, or is at least as good as, the current state-of-the-art methods.
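The core idea, starting from a possibly too-small box and enlarging it when the optimizer keeps pushing against its edges, can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the expansion trigger (incumbent within 5% of a boundary) and the `expand_factor` rate are invented for this sketch, and a standard expected-improvement acquisition stands in for the paper's adaptive exploration-exploitation criterion.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Toy 1-D objective (maximize); optimum at x = 8, outside the initial bounds.
    return -(x - 8.0) ** 2

def expected_improvement(mu, sigma, best):
    # Standard EI for maximization over GP posterior mean mu and std sigma.
    z = (mu - best) / (sigma + 1e-9)
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
lo, hi = 0.0, 2.0          # initial search box; need not contain the optimum
expand_factor = 0.5        # hypothetical expansion rate per trigger (invented)
X = rng.uniform(lo, hi, 5).reshape(-1, 1)
y = objective(X).ravel()

for _ in range(30):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    cand = np.linspace(lo, hi, 500).reshape(-1, 1)
    mu, sigma = gp.predict(cand, return_std=True)
    x_next = cand[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next))
    # Expand the box when the incumbent sits within 5% of a boundary.
    x_best = X[np.argmax(y), 0]
    width = hi - lo
    if x_best > hi - 0.05 * width:
        hi += expand_factor * width
    elif x_best < lo + 0.05 * width:
        lo -= expand_factor * width

print(f"best x = {X[np.argmax(y), 0]:.3f}, "
      f"best y = {y.max():.3f}, bounds = [{lo:.2f}, {hi:.2f}]")
```

On this toy objective the true optimum at x = 8 lies outside the initial [0, 2] box; the boundary-hitting incumbent repeatedly triggers expansion of the upper bound until the optimum becomes reachable. The paper's contribution addresses what this sketch ignores: keeping such expansion from degenerating into over-exploration of the ever-growing space.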
