Efficient Differentiable Neural Architecture Search with Meta Kernels

by Shoufa Chen, et al.

The search procedure of neural architecture search (NAS) is notoriously time-consuming and cost-prohibitive. To make the search space continuous, most existing gradient-based NAS methods relax the categorical choice of a particular operation to a softmax over all possible operations and compute the weighted sum of multiple features, resulting in a large memory requirement and a heavy computation burden. In this work, we propose a novel and efficient search strategy with meta kernels. We directly encode the supernet from the perspective of convolution kernels and "shrink" multiple convolution kernel candidates into a single one before these candidates operate on the input feature. In this way, only a single feature is generated between two intermediate nodes. The memory for storing intermediate features and the resource budget for conducting convolution operations are both reduced remarkably. Despite its high efficiency, our search strategy can search in a more fine-grained way than existing works and increases the capacity for representing possible networks. We demonstrate the effectiveness of our search strategy by conducting extensive experiments. Specifically, our method achieves 77.0% top-1 accuracy, outperforming both EfficientNet and MobileNetV3 under the same FLOPs constraints. Compared to models discovered by the state-of-the-art NAS method, our method achieves the same (sometimes even better) performance, while being faster by three orders of magnitude.
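The key observation behind "shrinking" kernel candidates is that convolution is linear in the kernel: a softmax-weighted sum of per-candidate convolution outputs equals a single convolution with the softmax-weighted kernel. A minimal 1-D sketch (illustrative names only; equal-sized candidates are assumed here, whereas differently sized candidates would first need alignment, e.g. by zero-padding):

```python
import numpy as np

def softmax(v):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(v - v.max())
    return e / e.sum()

rng = np.random.default_rng(0)
x = rng.standard_normal(32)                              # input feature
candidates = [rng.standard_normal(3) for _ in range(4)]  # kernel candidates
alpha = softmax(rng.standard_normal(4))                  # architecture weights

# DARTS-style relaxation: one convolution per candidate, then a weighted
# sum of the resulting features (4 convolutions, 4 stored features).
baseline = sum(a * np.convolve(x, k, mode="same")
               for a, k in zip(alpha, candidates))

# Meta-kernel view: shrink the candidates into one kernel first, then run
# a single convolution (1 convolution, 1 stored feature).
meta_kernel = sum(a * k for a, k in zip(alpha, candidates))
merged = np.convolve(x, meta_kernel, mode="same")

# By linearity of convolution, the two are identical.
assert np.allclose(baseline, merged)
```

This equivalence is what lets the memory and compute cost between two nodes drop from one feature per candidate to a single feature, regardless of the number of candidates.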


