HyperBO+: Pre-training a universal prior for Bayesian optimization with hierarchical Gaussian processes

12/20/2022
by Zhou Fan, et al.

Bayesian optimization (BO), while proven highly effective for many black-box function optimization tasks, requires practitioners to carefully select priors that model their functions of interest well. Rather than specifying priors by hand, researchers have investigated transfer learning based methods to learn the priors automatically, e.g., multi-task BO (Swersky et al., 2013), few-shot BO (Wistuba and Grabocka, 2021) and HyperBO (Wang et al., 2022). However, those prior learning methods typically assume that the input domains are the same for all tasks, weakening their ability to use observations on functions with different domains or to generalize the learned priors to BO on different search spaces. In this work, we present HyperBO+: a pre-training approach for hierarchical Gaussian processes that enables the same prior to work universally for Bayesian optimization on functions with different domains. We propose a two-step pre-training method and analyze its appealing asymptotic properties and benefits to BO both theoretically and empirically. On real-world hyperparameter tuning tasks that involve multiple search spaces, we demonstrate that HyperBO+ is able to generalize to unseen search spaces and achieves lower regret than competitive baselines.
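As a rough illustration of the kind of two-step pre-training described in the abstract, the sketch below first fits per-task GP hyperparameters by maximizing the marginal likelihood on each training function, then fits a simple Gaussian hyper-prior over the resulting log-hyperparameters. The squared-exponential kernel, the diagonal Gaussian hyper-prior, and all function names are illustrative assumptions for this sketch, not the exact model or estimator used in HyperBO+.

```python
# Minimal sketch of two-step pre-training for a hierarchical GP prior.
# Assumptions (not from the paper): 1-D inputs, squared-exponential kernel,
# point estimates per task, and a diagonal Gaussian hyper-prior in log space.
import numpy as np
from scipy.optimize import minimize


def sq_exp_kernel(x1, x2, lengthscale, signal_var):
    """Squared-exponential kernel on 1-D inputs (illustrative choice)."""
    d = x1[:, None] - x2[None, :]
    return signal_var * np.exp(-0.5 * (d / lengthscale) ** 2)


def neg_log_marginal_likelihood(log_params, x, y):
    """Negative GP log marginal likelihood for a single training function."""
    lengthscale, signal_var, noise_var = np.exp(log_params)
    k = sq_exp_kernel(x, x, lengthscale, signal_var) + noise_var * np.eye(len(x))
    chol = np.linalg.cholesky(k)
    alpha = np.linalg.solve(chol.T, np.linalg.solve(chol, y))
    return 0.5 * y @ alpha + np.sum(np.log(np.diag(chol))) + 0.5 * len(x) * np.log(2 * np.pi)


def fit_gp_hyperparameters(x, y):
    """Step 1: point-estimate GP hyperparameters on one training task."""
    result = minimize(neg_log_marginal_likelihood, x0=np.zeros(3), args=(x, y),
                      method="L-BFGS-B")
    return result.x  # log(lengthscale), log(signal_var), log(noise_var)


def fit_hyper_prior(per_task_log_params):
    """Step 2: fit a diagonal Gaussian hyper-prior over per-task log-hyperparameters."""
    stacked = np.stack(per_task_log_params)
    return stacked.mean(axis=0), stacked.std(axis=0) + 1e-6


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy training tasks: each task has its own inputs; domains need not match.
    tasks = []
    for _ in range(5):
        x = np.sort(rng.uniform(0, 10, size=20))
        y = np.sin(x) + 0.1 * rng.standard_normal(20)
        tasks.append((x, y))

    per_task = [fit_gp_hyperparameters(x, y) for x, y in tasks]
    mean, std = fit_hyper_prior(per_task)
    print("Hyper-prior mean of log-hyperparameters:", mean)
    print("Hyper-prior std of log-hyperparameters:", std)
```

At BO time, such a hyper-prior could serve as a prior over GP hyperparameters for a new, previously unseen search space; the exact way HyperBO+ parameterizes and uses the hierarchical prior is described in the full paper.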


Related research

07/07/2022 · Pre-training helps Bayesian optimization too
Bayesian optimization (BO) has become a popular strategy for global opti...

08/09/2023 · Efficient Bayesian Optimization with Deep Kernel Learning and Transformer Pre-trained on Multiple Heterogeneous Datasets
Bayesian optimization (BO) is widely adopted in black-box optimization p...

12/21/2021 · Provable Hierarchical Lifelong Learning with a Sketch-based Modular Architecture
We propose a modular architecture for the lifelong learning of hierarchi...

11/22/2021 · Transfer Learning with Gaussian Processes for Bayesian Optimization
Bayesian optimization is a powerful paradigm to optimize black-box funct...

10/17/2022 · Conditional Neural Processes for Molecules
Neural processes (NPs) are models for transfer learning with properties ...

09/20/2017 · Bayesian Optimization with Automatic Prior Selection for Data-Efficient Direct Policy Search
One of the most interesting features of Bayesian optimization for direct...

07/17/2017 · Learning to select data for transfer learning with Bayesian Optimization
Domain similarity measures can be used to gauge adaptability and select ...