Orchestrate: Infrastructure for Enabling Parallelism during Hyperparameter Optimization

12/19/2018
by   Alexandra Johnson, et al.

Two key factors dominate the development of effective production-grade machine learning models. First, building such models requires a local software implementation and iteration process. Second, it requires distributed infrastructure to efficiently conduct training and hyperparameter optimization. While modern machine learning frameworks are very effective at the former, practitioners are often left building ad hoc frameworks for the latter. We present SigOpt Orchestrate, a library for conducting such distributed training and hyperparameter optimization in a cloud environment. We describe the motivating factors and resulting design of this library, feedback from initial testing, and future goals.
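The workload the abstract describes, evaluating many hyperparameter configurations concurrently rather than one at a time, can be sketched with the Python standard library. This is an illustrative example only, using `concurrent.futures` with a hypothetical objective function; it is not SigOpt Orchestrate's actual API.

```python
from concurrent.futures import ProcessPoolExecutor
import random

def train_and_evaluate(config):
    """Stand-in for a real training run: score one hyperparameter configuration.
    A real implementation would train a model and return a validation metric;
    this toy objective just peaks at lr=0.01, reg=0.1."""
    lr, reg = config["lr"], config["reg"]
    return -(lr - 0.01) ** 2 - (reg - 0.1) ** 2

def random_configs(n, seed=0):
    """Sample n random hyperparameter configurations (random search)."""
    rng = random.Random(seed)
    return [{"lr": rng.uniform(1e-4, 1e-1), "reg": rng.uniform(0.0, 1.0)}
            for _ in range(n)]

if __name__ == "__main__":
    configs = random_configs(20)
    # Evaluate all configurations in parallel, one worker process per core.
    # In a distributed setup each evaluation would instead run on a cluster node.
    with ProcessPoolExecutor() as pool:
        scores = list(pool.map(train_and_evaluate, configs))
    best = max(range(len(configs)), key=lambda i: scores[i])
    print("best score:", scores[best], "config:", configs[best])
```

Because the configurations are independent, the search parallelizes trivially; the engineering difficulty the paper targets is provisioning and managing the cloud workers that replace the local process pool here.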

