DAS: Neural Architecture Search via Distinguishing Activation Score

12/23/2022
by Yuqiao Liu et al.

Neural Architecture Search (NAS) is an automatic technique that can search for well-performing architectures for a specific task. Although NAS surpasses human-designed architectures in many fields, the high computational cost of architecture evaluation hinders its development. A feasible solution is to evaluate some metric of an architecture in its initial state, without any training. The NAS without training (WOT) score is such a metric: it estimates the final trained accuracy of an architecture from its ability to distinguish different inputs at the activation layers. However, the WOT score is not an atomic metric, meaning that it does not represent a single fundamental property of the architecture. The contributions of this paper are threefold. First, we decouple WOT into two atomic metrics, representing the distinguishing ability of the network and the number of activation units, and explore better combination rules, yielding the Distinguishing Activation Score (DAS). We prove the correctness of the decoupling theoretically and confirm the effectiveness of the combination rules experimentally. Second, to improve the prediction accuracy of DAS enough to meet practical search requirements, we propose a fast training strategy. When DAS is combined with this fast training strategy, it yields further improvements. Third, we propose a dataset called Darts-training-bench (DTB), which fills the gap that existing datasets contain no training states of architectures. Our proposed method achieves 1.04× - 1.56× improvements on NAS-Bench-101, Network Design Spaces, and the proposed DTB.
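To make the starting point concrete, below is a minimal sketch of a WOT-style score in the spirit of NAS without training: each input in a minibatch is mapped to a binary code of its ReLU activation pattern, a kernel counts agreements between codes, and the log-determinant of that kernel rewards architectures that distinguish inputs. This is an illustrative reconstruction, not the paper's exact DAS formulation; the function name `wot_score` and the agreement kernel are assumptions for illustration.

```python
import numpy as np

def wot_score(binary_codes):
    """Sketch of a WOT-style distinguishing score.

    binary_codes: (N, M) 0/1 matrix; row i is the ReLU activation
    pattern of input i across M activation units.
    Kernel entry K[i, j] counts units where inputs i and j agree
    (both active or both inactive); the score is log|det K|, which
    grows when activation patterns are more distinguishable and
    collapses to -inf when two inputs share an identical pattern.
    """
    codes = np.asarray(binary_codes, dtype=np.float64)
    # Agreements on active units plus agreements on inactive units.
    K = codes @ codes.T + (1.0 - codes) @ (1.0 - codes).T
    _sign, logdet = np.linalg.slogdet(K)
    return logdet
```

Note how the two factors the paper decouples both appear here: the off-diagonal structure of `K` reflects the network's distinguishing ability, while the diagonal scale of `K` equals the number of activation units `M`; DAS recombines such atomic quantities under better rules than the raw log-determinant.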

