Efficient Model Performance Estimation via Feature Histories

03/07/2021, by Shengcao Cao, et al.

An important step in the task of neural network design, such as hyper-parameter optimization (HPO) or neural architecture search (NAS), is the evaluation of a candidate model's performance. Given fixed computational resources, one can either invest more time training each model to obtain more accurate estimates of final performance, or spend more time exploring a greater variety of models in the configuration space. In this work, we aim to optimize this exploration-exploitation trade-off in the context of HPO and NAS for image classification by accurately approximating a model's maximal performance early in the training process. In contrast to recent accelerated NAS methods customized for certain search spaces, e.g., requiring the search space to be differentiable, our method is flexible and imposes almost no constraints on the search space. Our method uses the evolution history of features of a network during the early stages of training to build a proxy classifier that matches the peak performance of the network under consideration. We show that our method can be combined with multiple search algorithms to find better solutions to a wide range of tasks in HPO and NAS. Using a sampling-based search algorithm and parallel computing, our method can find an architecture which is better than DARTS and with an 80
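The core idea described above — snapshotting a network's features at several early training epochs and fitting a lightweight proxy classifier on their concatenation to approximate the network's peak accuracy — can be sketched as follows. This is a hedged illustration, not the authors' implementation: the snapshot schedule, the choice of penultimate-layer features, and the linear softmax probe are all assumptions made for the sake of a minimal, self-contained example.

```python
# Sketch of the feature-history proxy idea (assumed details, not the paper's code).
# Assumption: we record penultimate-layer features of a held-out set at a few
# early epochs, concatenate the snapshots feature-wise, and train a simple
# linear softmax probe; the probe's accuracy serves as the early estimate of
# the candidate network's final (peak) performance.
import numpy as np

def linear_probe_accuracy(features, labels, steps=200, lr=0.1):
    """Train a softmax linear classifier on `features`; return its train accuracy."""
    n, d = features.shape
    k = int(labels.max()) + 1
    W = np.zeros((d, k))
    onehot = np.eye(k)[labels]
    for _ in range(steps):
        logits = features @ W
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)
        W -= lr * features.T @ (p - onehot) / n        # softmax-regression gradient
    return float((np.argmax(features @ W, axis=1) == labels).mean())

def estimate_peak_performance(feature_history, labels):
    """`feature_history`: list of (n, d) arrays, one feature snapshot per early
    epoch. Concatenating along the feature axis lets the probe exploit how the
    representation evolved, not just its latest state."""
    stacked = np.concatenate(feature_history, axis=1)
    return linear_probe_accuracy(stacked, labels)

# Toy demo: synthetic "snapshots" that grow progressively more class-separable,
# mimicking features improving over early epochs.
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=200)
snapshots = [labels[:, None] * s + rng.normal(size=(200, 8)) for s in (0.5, 1.0, 1.5)]
score = estimate_peak_performance(snapshots, labels)
```

In a real HPO/NAS loop, `score` would stand in for the expensive fully-trained accuracy, letting the search algorithm rank many candidates after only a few epochs of training each.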

Related research:

- Search Space Adaptation for Differentiable Neural Architecture Search in Image Classification (06/05/2022)
- NAS-Bench-102: Extending the Scope of Reproducible Neural Architecture Search (01/02/2020)
- AutoSpace: Neural Architecture Search with Less Human Interference (03/22/2021)
- Rethinking Performance Estimation in Neural Architecture Search (05/20/2020)
- ICDARTS: Improving the Stability and Performance of Cyclic DARTS (09/01/2023)
- On the Security Risks of AutoML (10/12/2021)
- Lessons from the Clustering Analysis of a Search Space: A Centroid-based Approach to Initializing NAS (08/20/2021)
