Sub-linear Regret Bounds for Bayesian Optimisation in Unknown Search Spaces

09/05/2020
by Hung Tran-The, et al.

Bayesian optimisation (BO) is a popular method for the efficient optimisation of expensive black-box functions. Traditionally, BO assumes that the search space is known; in many problems, however, this assumption does not hold. To this end, we propose a novel BO algorithm that expands (and shifts) the search space over iterations, controlling the expansion rate through a hyperharmonic series. We further propose a variant of this algorithm that scales to high dimensions. We show theoretically that the cumulative regret of both algorithms grows at a sub-linear rate. Our experiments on synthetic and real-world optimisation tasks demonstrate the superiority of our algorithms over current state-of-the-art methods for Bayesian optimisation in unknown search spaces.
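To give a rough feel for the expanding-search-space idea described in the abstract, here is a minimal Python sketch (not the authors' implementation). The search box is enlarged each iteration by a term of a hyperharmonic (p-)series, 1/t^p with p > 1, so the total expansion over all iterations stays bounded; the acquisition step is a deliberately simple placeholder where a real BO loop would use a GP-based rule such as GP-UCB or expected improvement. The names expand_bounds and bo_with_expanding_space, and the constants p and c, are illustrative assumptions.

```python
# Hypothetical sketch: BO-style loop over a search box that grows each
# iteration according to a hyperharmonic (p-)series. Not the paper's method.
import numpy as np


def expand_bounds(bounds, t, p=1.5, c=1.0):
    """Grow each dimension of `bounds` by c / t**p on both sides.

    With p > 1 the series sum_t c / t**p converges, so the total
    expansion of the box over all iterations is bounded."""
    delta = c / t**p
    return np.column_stack([bounds[:, 0] - delta, bounds[:, 1] + delta])


def bo_with_expanding_space(f, init_bounds, n_iter=30, n_cand=500, seed=0):
    """Toy maximisation loop over an expanding search box."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(init_bounds, dtype=float)
    X, y = [], []
    for t in range(1, n_iter + 1):
        bounds = expand_bounds(bounds, t)  # hyperharmonic expansion schedule
        cand = rng.uniform(bounds[:, 0], bounds[:, 1],
                           size=(n_cand, bounds.shape[0]))
        if X:
            # Placeholder acquisition: pick the candidate farthest from all
            # observed points (pure exploration); a real BO loop would use a
            # GP-based acquisition function here.
            dists = np.linalg.norm(cand[:, None, :] - np.array(X)[None, :, :],
                                   axis=-1).min(axis=1)
            x_next = cand[np.argmax(dists)]
        else:
            x_next = cand[0]
        X.append(x_next)
        y.append(f(x_next))
    best = int(np.argmax(y))
    return X[best], y[best]


if __name__ == "__main__":
    # Toy objective whose maximiser (x = [2, 2]) lies outside the initial
    # unit box, so it only becomes reachable once the box has expanded.
    f = lambda x: -np.sum((x - 2.0) ** 2)
    x_best, f_best = bo_with_expanding_space(f, [[0.0, 1.0], [0.0, 1.0]])
    print("best point:", x_best, "best value:", f_best)
```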

Related research

- Bayesian Optimistic Optimisation with Exponentially Decaying Regret (05/10/2021)
- Bayesian Optimisation of Functions on Graphs (06/08/2023)
- Trading Convergence Rate with Computational Budget in High Dimensional Bayesian Optimization (11/27/2019)
- Ordinal Bayesian Optimisation (12/05/2019)
- Optimal resampling for the noisy OneMax problem (07/22/2016)
- Bayesian Optimisation for Mixed-Variable Inputs using Value Proposals (02/10/2022)
