Hyperparameter Optimization as a Service on INFN Cloud

01/13/2023
by Matteo Barbetti, et al. (INFN)

The simplest and often most effective way of parallelizing the training of complex machine learning models is to execute several training instances on multiple machines, possibly scanning the hyperparameter space to optimize the underlying statistical model and the learning procedure. Such a meta-learning procedure is often limited by the ability to securely access a common database that organizes the knowledge of previous and ongoing trials. Exploiting opportunistic GPUs provided in different environments represents a further challenge when designing such optimization campaigns. In this contribution we discuss how a set of REST APIs can be used to access a dedicated service based on INFN Cloud to monitor and possibly coordinate multiple training instances, using gradient-free optimization techniques, via simple HTTP requests. The service, named Hopaas (Hyperparameter OPtimization As A Service), consists of a web interface and a set of APIs implemented with a FastAPI back-end running through Uvicorn and NGINX in a virtual instance of INFN Cloud. The optimization algorithms are currently based on Bayesian techniques as provided by Optuna. A Python front-end is also available for quick prototyping. We present applications to hyperparameter optimization campaigns performed by combining private, INFN Cloud, and CINECA resources.
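To illustrate the ask/tell pattern described in the abstract, the sketch below shows how a training instance might request hyperparameters from a central optimization service and report back its result over plain HTTP. The server URL, endpoint paths, payload fields, and authentication scheme are illustrative assumptions, not the actual Hopaas API.

```python
# Hypothetical client sketch for a Hopaas-like service (assumed endpoints).
import requests

HOPAAS_URL = "https://hopaas.example.cloud.infn.it"  # placeholder address
API_TOKEN = "user-api-token"                          # placeholder token


def ask_trial(study_name: str, search_space: dict) -> dict:
    """Request a new set of hyperparameters for a study (assumed /ask endpoint)."""
    resp = requests.post(
        f"{HOPAAS_URL}/ask",
        json={"study": study_name, "space": search_space},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"trial_id": 17, "params": {...}}


def tell_result(study_name: str, trial_id: int, loss: float) -> None:
    """Report the objective value of a finished trial (assumed /tell endpoint)."""
    resp = requests.post(
        f"{HOPAAS_URL}/tell",
        json={"study": study_name, "trial_id": trial_id, "loss": loss},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    # Hypothetical search-space encoding; the real service may differ.
    space = {
        "learning_rate": ("loguniform", 1e-5, 1e-1),
        "batch_size": ("categorical", [64, 128, 256]),
    }
    trial = ask_trial("my-study", space)
    loss = 0.42  # placeholder: run the actual training with trial["params"]
    tell_result("my-study", trial["trial_id"], loss)
```

Because each training instance only needs outbound HTTP access, the same pattern can run unchanged on private machines, INFN Cloud instances, or CINECA nodes, with the central service keeping the shared Optuna study consistent across all of them.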


Code Repositories

poster-acat2022-hopaas
Repository to generate the HTML poster about Hopaas for ACAT 2022, based on the template designed by @cpitclaudel
