EPNAS: Efficient Progressive Neural Architecture Search

07/07/2019
by Yanqi Zhou, et al.

In this paper, we propose Efficient Progressive Neural Architecture Search (EPNAS), a neural architecture search (NAS) method that efficiently handles a large search space through a novel progressive search policy with performance prediction, based on REINFORCE (Williams, 1992). EPNAS is designed to search target networks in parallel, which makes it more scalable on parallel systems such as GPU/TPU clusters. More importantly, EPNAS generalizes to architecture search with multiple resource constraints, e.g., model size, compute complexity, or compute intensity, which is crucial for deployment on widespread platforms such as mobile and cloud. We compare EPNAS against other state-of-the-art (SoTA) network architectures (e.g., MobileNetV2 (Sandler et al., 2018)) and efficient NAS algorithms (e.g., ENAS (Pham et al., 2018) and PNAS (Liu et al., 2018)) on image recognition tasks using CIFAR10 and ImageNet. On both datasets, EPNAS achieves superior architecture search speed and recognition accuracy.
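To make the REINFORCE-based search policy concrete, here is a minimal sketch of a policy-gradient architecture controller over a toy search space. The search space (two layers, three candidate operations per layer), the `reward` function, the learning rate, and the moving-average baseline are all illustrative assumptions, not the paper's actual configuration; EPNAS additionally uses progressive search and performance prediction, which are omitted here.

```python
import math
import random

random.seed(0)

# Toy search space: pick one of NUM_OPS operations for each of NUM_LAYERS layers.
NUM_LAYERS, NUM_OPS = 2, 3

# Hypothetical reward standing in for validation accuracy:
# pretend op 2 is the best choice for every layer.
def reward(arch):
    return sum(1.0 for op in arch if op == 2) / NUM_LAYERS

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def sample(probs):
    r, acc = random.random(), 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

# One independent softmax policy (parameterized by logits) per layer.
logits = [[0.0] * NUM_OPS for _ in range(NUM_LAYERS)]
lr, baseline = 0.5, 0.0

for step in range(300):
    # Sample an architecture from the current policy.
    probs_per_layer = [softmax(logits[l]) for l in range(NUM_LAYERS)]
    arch = [sample(p) for p in probs_per_layer]

    # Score it and compute the advantage against a moving-average baseline,
    # a common variance-reduction trick in REINFORCE.
    R = reward(arch)
    advantage = R - baseline
    baseline = 0.9 * baseline + 0.1 * R

    # REINFORCE update: grad of log pi(a) w.r.t. logits is one_hot(a) - probs.
    for l in range(NUM_LAYERS):
        for i in range(NUM_OPS):
            grad = (1.0 if i == arch[l] else 0.0) - probs_per_layer[l][i]
            logits[l][i] += lr * advantage * grad

# After training, the policy should concentrate on the high-reward ops.
best = [max(range(NUM_OPS), key=lambda i: logits[l][i]) for l in range(NUM_LAYERS)]
print(best)
```

In a real NAS setting the reward would come from training and evaluating the sampled child network, which is the expensive step that EPNAS's parallel search and performance prediction are designed to amortize.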
