PHS: A Toolbox for Parallel Hyperparameter Search

02/26/2020
by   Peter Michael Habelitz, et al.
Fraunhofer

We introduce PHS (Parallel Hyperparameter Search), an open-source Python framework that enables hyperparameter optimization of any arbitrary Python function across numerous compute instances. This is achieved with only minimal modifications inside the target function. Possible applications include expensive-to-evaluate numerical computations that depend strongly on hyperparameters, such as machine learning. Bayesian optimization is chosen as a sample-efficient method to propose the next query set of parameters.
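The general idea of evaluating an arbitrary target function in parallel over candidate hyperparameter sets can be sketched in plain Python. The snippet below is a hypothetical illustration, not the PHS API: it uses random search instead of Bayesian optimization, and the names `target`, `sample_hyperparameters`, and `parallel_search` are made up for this example.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def target(hp):
    # Toy objective: a quadratic with its minimum near lr=0.1, width=8.
    # In practice this would be an expensive computation, e.g. training a model.
    return (hp["lr"] - 0.1) ** 2 + (hp["width"] - 8) ** 2

def sample_hyperparameters(rng):
    # Draw one candidate configuration from the search space.
    return {"lr": rng.uniform(0.001, 1.0), "width": rng.randint(1, 64)}

def parallel_search(objective, n_trials=32, n_workers=4, seed=0):
    # Propose candidates up front (random search); a Bayesian optimizer
    # would instead propose each batch based on the results so far.
    rng = random.Random(seed)
    candidates = [sample_hyperparameters(rng) for _ in range(n_trials)]
    # Evaluate candidates concurrently on a pool of workers.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        losses = list(pool.map(objective, candidates))
    best = min(range(n_trials), key=losses.__getitem__)
    return candidates[best], losses[best]

best_hp, best_loss = parallel_search(target)
print(best_hp, best_loss)
```

The only contract the target function must satisfy in this sketch is accepting a single dictionary of hyperparameters and returning a scalar loss, which mirrors the paper's goal of requiring minimal changes to the user's function.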



Code Repositories

PHS

parallel hyperparameter search


