HNAS: Hierarchical Neural Architecture Search on Mobile Devices

05/15/2020
by Xin Xia, et al.

Neural Architecture Search (NAS) has attracted growing interest. To reduce the search cost, recent work has explored weight sharing across models and made major progress in One-Shot NAS. However, it has been observed that a model with higher one-shot accuracy does not necessarily perform better when trained stand-alone. To address this issue, we propose a new method named Hierarchical Neural Architecture Search (HNAS). Unlike previous approaches, where all layers in the supernet share the same operation search space, we formulate a hierarchical search strategy based on operation pruning and build a layer-wise operation search space. In this way, HNAS can automatically select the operations for each layer. During the search, we also take hardware platform constraints into consideration for efficient neural network deployment. Extensive experiments on ImageNet show that under a mobile latency constraint, our models consistently outperform state-of-the-art models, both those designed manually and those generated automatically by NAS methods.
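To make the idea of a layer-wise search space with operation pruning concrete, the following is a minimal illustrative sketch, not the authors' implementation. The operation names, scores, latencies, budget, and pruning schedule are all invented for illustration: each layer keeps its own candidate set, the weakest candidate per layer is pruned each round, and the final architecture is checked against a toy latency budget (standing in for the hardware constraint mentioned in the abstract).

```python
# Hypothetical sketch of layer-wise operation pruning under a latency
# budget. All values below are made up for illustration.

LATENCY_BUDGET_MS = 30.0  # assumed mobile latency constraint

# Per-layer candidate operations, each mapped to an (assumed) one-shot
# accuracy score and a per-op latency estimate (e.g. from a lookup table).
search_space = {
    "layer_0": {"conv3x3": (0.72, 4.0), "conv5x5": (0.74, 7.0), "skip": (0.60, 0.1)},
    "layer_1": {"conv3x3": (0.70, 4.0), "mbconv3": (0.75, 5.5), "skip": (0.55, 0.1)},
    "layer_2": {"mbconv3": (0.73, 5.5), "mbconv6": (0.76, 9.0), "skip": (0.58, 0.1)},
}

def prune_round(space):
    """Drop the lowest-scoring op in every layer that still has >1 candidate."""
    for ops in space.values():
        if len(ops) > 1:
            worst = min(ops, key=lambda name: ops[name][0])
            del ops[worst]

def search(space, budget_ms):
    """Prune until each layer keeps one op, then check the latency budget."""
    while any(len(ops) > 1 for ops in space.values()):
        prune_round(space)
    arch = {layer: next(iter(ops)) for layer, ops in space.items()}
    total_latency = sum(ops[arch[layer]][1] for layer, ops in space.items())
    return arch, total_latency, total_latency <= budget_ms

arch, latency, feasible = search(search_space, LATENCY_BUDGET_MS)
```

Because each layer is pruned independently, different layers can end up with different operations, which is the key difference from a single shared operation space across all layers.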
