Multi-Source Unsupervised Hyperparameter Optimization

by Masahiro Nomura, et al.
CyberAgent, Inc.
Tokyo Institute of Technology

How can we conduct efficient hyperparameter optimization for a completely new task? In this work, we consider a novel setting in which we search for the optimal hyperparameters of a target task of interest using only unlabeled target task data and datasets from somewhat relevant source tasks. In this setting, it is essential to estimate the ground-truth target task objective using only the available information. We propose estimators that unbiasedly approximate the ground-truth objective with a desirable variance property. Building on these estimators, we provide a general and tractable hyperparameter optimization procedure for our setting. Experimental evaluations demonstrate that the proposed framework broadens the applications of automated hyperparameter optimization.
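The abstract's core idea, unbiasedly estimating a target-task objective from differently distributed data, is commonly achieved via importance weighting. The sketch below is a hypothetical illustration of that generic principle, not the paper's actual estimator: an expectation under a target distribution is recovered from source-distribution samples by reweighting with the density ratio. The distributions, loss, and sample sizes are all assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (assumed, not from the paper):
# target p_T = N(1, 1), source p_S = N(0, 1), loss ell(x) = x**2.
# Importance weighting: E_{p_T}[ell(x)] = E_{p_S}[w(x) * ell(x)],
# where w(x) = p_T(x) / p_S(x), so source samples alone suffice.

def normal_pdf(x, mu):
    """Standard-deviation-1 Gaussian density with mean mu."""
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

x_src = rng.normal(0.0, 1.0, size=1_000_000)          # source samples only
w = normal_pdf(x_src, 1.0) / normal_pdf(x_src, 0.0)   # density ratio p_T/p_S
est = np.mean(w * x_src ** 2)                          # importance-weighted estimate

# Ground truth under the target: E[x^2] = Var + mean^2 = 1 + 1 = 2
print(est)
```

The estimate concentrates around 2.0, the true target expectation, even though no target-distribution loss evaluations were used; controlling the variance of the weights `w` is the "desirable variance property" such estimators must address.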



