Mining Robust Default Configurations for Resource-constrained AutoML

02/20/2022
by Moe Kayali, et al.

Automatic machine learning (AutoML) is a key enabler of the mass deployment of the next generation of machine learning systems. A key desideratum for future ML systems is the automatic selection of models and hyperparameters. We present a novel method of selecting performant configurations for a given task by performing offline AutoML and mining over a diverse set of tasks. By mining the training tasks, we can select a compact portfolio of configurations that perform well over a wide variety of tasks, as well as learn a strategy to select portfolio configurations for yet-unseen tasks. The algorithm runs in a zero-shot manner, that is, without training any models online except the chosen one. In a compute- or time-constrained setting, this virtually instant selection is highly performant. Further, we show that our approach is effective for warm-starting existing AutoML platforms. In both settings, we demonstrate an improvement over the state of the art by testing on 62 classification and regression datasets. We also demonstrate the utility of recommending data-dependent default configurations that outperform widely used hand-crafted defaults.
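To make the zero-shot selection step concrete, the sketch below shows one plausible way such a scheme can work: match an unseen task to the most similar training task via cheap meta-features, then return that task's best portfolio configuration. It is a minimal, hypothetical illustration only; the portfolio, meta-features, and all names (Configuration, PORTFOLIO, meta_features, select_configuration) are assumptions for exposition, not the paper's actual implementation.

```python
# Minimal sketch of zero-shot portfolio selection, assuming a compact
# portfolio of configurations and per-task meta-features were already
# mined offline. All names and values here are illustrative.
from dataclasses import dataclass
from typing import Dict, List

import numpy as np


@dataclass
class Configuration:
    """A model family plus hyperparameters, mined offline."""
    model: str
    hyperparameters: Dict[str, object]


# Hypothetical offline artifacts: the mined portfolio, each training task's
# meta-features, and the portfolio member that performed best on it.
PORTFOLIO: List[Configuration] = [
    Configuration("lightgbm", {"num_leaves": 31, "learning_rate": 0.1}),
    Configuration("lightgbm", {"num_leaves": 255, "learning_rate": 0.03}),
    Configuration("random_forest", {"max_features": 0.5}),
]
TRAIN_TASK_META = np.array(
    [[1_000, 20, 2], [50_000, 300, 10], [5_000, 8, 2]], dtype=float
)  # e.g. (n_rows, n_features, n_classes) per training task
BEST_CONFIG_INDEX: List[int] = [0, 1, 2]  # best portfolio member per task


def meta_features(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Cheap, model-free dataset statistics used to compare tasks."""
    return np.array([X.shape[0], X.shape[1], len(np.unique(y))], dtype=float)


def select_configuration(X: np.ndarray, y: np.ndarray) -> Configuration:
    """Zero-shot selection: no model is trained before choosing.

    The unseen task is matched to the nearest training task in scaled
    meta-feature space; only the returned configuration is trained online.
    """
    query = meta_features(X, y)
    scale = TRAIN_TASK_META.max(axis=0) + 1e-12
    distances = np.linalg.norm(TRAIN_TASK_META / scale - query / scale, axis=1)
    return PORTFOLIO[BEST_CONFIG_INDEX[int(np.argmin(distances))]]
```

In the warm-starting setting described above, the same nearest-task ranking over portfolio members could instead be handed to an existing AutoML system as its initial candidate configurations rather than training only the single chosen one.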

