Surrogate Model Based Hyperparameter Tuning for Deep Learning with SPOT

by Thomas Bartz-Beielstein, et al.

A surrogate model based hyperparameter tuning approach for deep learning is presented. This article demonstrates how the architecture-level parameters (hyperparameters) of deep learning models implemented in Keras/tensorflow can be optimized. The implementation of the tuning procedure is 100 % accessible from R, the software environment for statistical computing. With a few lines of code, existing R packages (tfruns and SPOT) can be combined to perform hyperparameter tuning. An elementary hyperparameter tuning task (a neural network trained on the MNIST data) is used to exemplify this approach.
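The core loop of surrogate model based tuning is: evaluate the objective (e.g., validation loss of a trained network) at a few hyperparameter settings, fit a cheap surrogate to those observations, and evaluate next wherever the surrogate predicts the best value. The paper's SPOT package uses Kriging-style surrogates in R; as a language-neutral illustration only, the sketch below uses a simple one-dimensional quadratic surrogate and a hypothetical `objective` standing in for a real training run. All names here are illustrative, not SPOT's API.

```python
# Minimal sketch of surrogate-model-based tuning in one dimension.
# A quadratic surrogate is fit to the three best observations and its
# minimizer is proposed as the next hyperparameter setting to evaluate.

def objective(lr):
    # Hypothetical "validation loss" as a function of learning rate;
    # a real run would train a Keras model and return its loss.
    return (lr - 0.3) ** 2 + 0.05

def quadratic_minimizer(pts):
    # Fit a parabola through three (x, y) points; return its vertex x.
    (x1, y1), (x2, y2), (x3, y3) = pts
    denom = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / denom
    b = (x3**2 * (y1 - y2) + x2**2 * (y3 - y1) + x1**2 * (y2 - y3)) / denom
    return -b / (2 * a)

def tune(lo=0.0, hi=1.0, budget=10):
    # Initial design: three evenly spaced points in [lo, hi].
    history = [(x, objective(x)) for x in (lo, (lo + hi) / 2, hi)]
    for _ in range(budget):
        best3 = sorted(history, key=lambda p: p[1])[:3]
        x_new = min(max(quadratic_minimizer(best3), lo), hi)  # clip to bounds
        if any(abs(x_new - x) < 1e-9 for x, _ in history):
            break  # proposal already evaluated: converged
        history.append((x_new, objective(x_new)))
    return min(history, key=lambda p: p[1])

best_x, best_y = tune()
```

Because the stand-in objective is itself quadratic, the surrogate recovers its minimizer (0.3) after one model fit; with a real training loss, the loop would keep refitting and proposing points until the budget is spent.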


Related articles:

- Hyperparameter Tuning Cookbook: A guide for scikit-learn, PyTorch, river, and spotPython
- Hyperparameter Optimization with Neural Network Pruning
- HYPPO: A Surrogate-Based Multi-Level Parallelism Tool for Hyperparameter Optimization
- LAMVI-2: A Visual Tool for Comparing and Tuning Word Embedding Models
- How much progress have we made in neural network training? A New Evaluation Protocol for Benchmarking Optimizers
- Guided Hyperparameter Tuning Through Visualization and Inference
- CrossedWires: A Dataset of Syntactically Equivalent but Semantically Disparate Deep Learning Models
