Multi-step Planning for Automated Hyperparameter Optimization with OptFormer

by Lucio M. Dery, et al.
Carnegie Mellon University

As machine learning permeates more industries and models become more expensive and time-consuming to train, the need for efficient automated hyperparameter optimization (HPO) has never been more pressing. Approaches based on multi-step planning promise improved efficiency over myopic alternatives by more effectively balancing exploration and exploitation. However, the potential of these approaches has not been fully realized due to their technical complexity and computational intensity. In this work, we leverage recent advances in Transformer-based, natural-language-interfaced hyperparameter optimization to circumvent these barriers. We build on the recently proposed OptFormer, which casts both hyperparameter suggestion and target-function approximation as autoregressive generation, thus making planning via rollouts simple and efficient. We conduct an extensive exploration of different strategies for performing multi-step planning on top of the OptFormer model to highlight its potential for constructing non-myopic HPO strategies.
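The core idea described above can be illustrated with a minimal sketch. This is not the OptFormer implementation: `surrogate` and `suggest` are hypothetical stand-ins for the model's autoregressive function-approximation and suggestion heads, and the scoring rule (expected best-so-far after a fixed horizon) is one simple choice among the planning strategies the paper explores.

```python
import random

def plan_next(surrogate, suggest, history, candidates, horizon=3, n_rollouts=8):
    """Score each candidate by the expected best objective value reached
    after `horizon` simulated steps, then return the highest-scoring one.

    `surrogate(traj, x)` samples a predicted outcome for configuration x,
    and `suggest(traj)` samples a follow-up configuration; both are
    hypothetical placeholders for a learned, autoregressive model."""
    def rollout_value(candidate):
        total = 0.0
        for _ in range(n_rollouts):
            # Start the simulated trajectory by "trying" the candidate.
            traj = list(history) + [(candidate, surrogate(history, candidate))]
            best_seen = max(y for _, y in traj)
            # Simulate the remaining steps of the planning horizon.
            for _ in range(horizon - 1):
                x = suggest(traj)          # sampled next suggestion
                y = surrogate(traj, x)     # sampled predicted outcome
                traj.append((x, y))
                best_seen = max(best_seen, y)
            total += best_seen
        return total / n_rollouts          # Monte Carlo estimate of value
    return max(candidates, key=rollout_value)
```

For example, with a toy noisy surrogate peaked at 0.7 (`lambda traj, x: -(x - 0.7) ** 2 + random.gauss(0, 0.01)`) and uniform-random suggestions, `plan_next` selects the candidate whose simulated futures score best; non-myopia enters through the `horizon > 1` rollout steps.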



