GENNAPE: Towards Generalized Neural Architecture Performance Estimators

by Keith G. Mills et al.

Predicting neural architecture performance is a challenging task that is crucial to neural architecture design and search. Existing approaches either rely on neural performance predictors, which are limited to modeling architectures in a predefined design space with specific sets of operators and connection rules and cannot generalize to unseen architectures, or resort to zero-cost proxies, which are not always accurate. In this paper, we propose GENNAPE, a Generalized Neural Architecture Performance Estimator, which is pretrained on open neural architecture benchmarks and aims to generalize to completely unseen architectures through combined innovations in network representation, contrastive pretraining, and a fuzzy clustering-based predictor ensemble. Specifically, GENNAPE represents a given neural network as a Computation Graph (CG) of atomic operations, which can model an arbitrary architecture. It first learns a graph encoder via Contrastive Learning to encourage network separation by topological features, and then trains multiple predictor heads, which are soft-aggregated according to the fuzzy cluster membership of a neural network. Experiments show that GENNAPE pretrained on NAS-Bench-101 achieves superior transferability to 5 different public neural network benchmarks, including NAS-Bench-201, NAS-Bench-301, and the MobileNet and ResNet families, with little or no fine-tuning. We further introduce 3 challenging newly labelled neural network benchmarks: HiAML, Inception and Two-Path, whose accuracies concentrate in narrow ranges. Extensive experiments show that GENNAPE can correctly discern high-performance architectures in these families. Finally, when paired with a search algorithm, GENNAPE finds architectures that improve accuracy while reducing FLOPs on all three families.
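The abstract describes soft-aggregating multiple predictor heads by a network's fuzzy cluster membership. The following is a minimal sketch of that idea, not the paper's actual implementation: membership weights are assumed here to come from a softmax over negative distances between a network's embedding and hypothetical cluster centroids, and each head is a stand-in scorer.

```python
import numpy as np

# Hypothetical sketch of fuzzy soft-aggregation over predictor heads.
# `centroids`, `heads`, and the distance-based membership rule are
# illustrative assumptions, not GENNAPE's actual components.

def fuzzy_memberships(embedding, centroids, temperature=1.0):
    """Soft cluster memberships via softmax over negative centroid distances."""
    dists = np.linalg.norm(centroids - embedding, axis=1)
    logits = -dists / temperature
    exp = np.exp(logits - logits.max())  # subtract max for numerical stability
    return exp / exp.sum()

def aggregate_prediction(embedding, centroids, heads):
    """Weight each head's accuracy estimate by the fuzzy membership."""
    weights = fuzzy_memberships(embedding, centroids)
    preds = np.array([head(embedding) for head in heads])
    return float(weights @ preds)

# Toy example: 2 clusters, each head is a simple linear scorer.
centroids = np.array([[0.0, 0.0], [1.0, 1.0]])
heads = [lambda z: 0.90 + 0.01 * z.sum(),
         lambda z: 0.80 + 0.02 * z.sum()]
emb = np.array([0.1, 0.2])
print(aggregate_prediction(emb, centroids, heads))
```

Because the memberships sum to one, the aggregated prediction is a convex combination of the individual heads' outputs, so it always lies between the lowest and highest head estimate.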


