Multi-step Planning for Automated Hyperparameter Optimization with OptFormer

10/10/2022
by Lucio M. Dery, et al.

As machine learning permeates more industries and models become more expensive and time-consuming to train, the need for efficient automated hyperparameter optimization (HPO) has never been more pressing. Multi-step planning approaches to hyperparameter optimization promise improved efficiency over myopic alternatives by balancing exploration and exploitation more effectively. However, the potential of these approaches has not been fully realized because of their technical complexity and computational intensity. In this work, we leverage recent advances in Transformer-based, natural-language-interfaced hyperparameter optimization to circumvent these barriers. We build on top of the recently proposed OptFormer, which casts both hyperparameter suggestion and target-function approximation as autoregressive generation, thus making planning via rollouts simple and efficient. We conduct an extensive exploration of different strategies for performing multi-step planning on top of the OptFormer model to highlight its potential for use in constructing non-myopic HPO strategies.
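The core idea — using one generative model as both candidate proposer and objective simulator, then scoring candidates by simulated rollouts — can be sketched as follows. This is a minimal, hypothetical illustration: the `ToySurrogate` class, its `suggest`/`predict` methods, and all parameter names are assumptions standing in for an OptFormer-style model, not the paper's actual interface.

```python
import random

class ToySurrogate:
    """Stand-in for an autoregressive model that, conditioned on the
    optimization history, can both propose hyperparameters and sample
    plausible objective values. Purely illustrative."""

    def suggest(self, history):
        # Sample a candidate hyperparameter (here: a single float in [0, 1]).
        return random.random()

    def predict(self, history, x):
        # Sample a plausible objective value for x given the history.
        return -(x - 0.7) ** 2 + random.gauss(0.0, 0.05)


def rollout_value(model, history, x, horizon):
    """Simulate `horizon` future steps after trying x; return the best
    objective value seen along the simulated trajectory."""
    sim = list(history)
    y = model.predict(sim, x)
    sim.append((x, y))
    best = y
    for _ in range(horizon - 1):
        nx = model.suggest(sim)
        ny = model.predict(sim, nx)
        sim.append((nx, ny))
        best = max(best, ny)
    return best


def plan_next(model, history, n_candidates=8, n_rollouts=16, horizon=3):
    """Non-myopic suggestion: score each candidate by the average outcome
    of Monte Carlo rollouts of the remaining budget, not just its own
    one-step predicted value."""
    candidates = [model.suggest(history) for _ in range(n_candidates)]

    def score(x):
        return sum(rollout_value(model, history, x, horizon)
                   for _ in range(n_rollouts)) / n_rollouts

    return max(candidates, key=score)
```

Because suggestion and prediction are both just (cheap) generation calls, the planner needs no separate acquisition-function machinery: deeper lookahead is obtained simply by lengthening the rollouts.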


Related research

06/18/2020 · Multi-Source Unsupervised Hyperparameter Optimization
How can we conduct efficient hyperparameter optimization for a completel...

09/08/2020 · Hyperparameter Optimization via Sequential Uniform Designs
Hyperparameter tuning or optimization plays a central role in the automa...

04/28/2022 · A Collection of Quality Diversity Optimization Problems Derived from Hyperparameter Optimization of Machine Learning Models
The goal of Quality Diversity Optimization is to generate a collection o...

02/24/2022 · DC and SA: Robust and Efficient Hyperparameter Optimization of Multi-subnetwork Deep Learning Models
We present two novel hyperparameter optimization strategies for optimiza...

02/26/2020 · PHS: A Toolbox for Parallel Hyperparameter Search
We introduce an open source python framework named PHS - Parallel Hyperp...

02/01/2023 · HOAX: A Hyperparameter Optimization Algorithm Explorer for Neural Networks
Computational chemistry has become an important tool to predict and unde...

07/28/2023 · Is One Epoch All You Need For Multi-Fidelity Hyperparameter Optimization?
Hyperparameter optimization (HPO) is crucial for fine-tuning machine lea...
