Neural Architecture Generator Optimization

04/03/2020
by Binxin Ru, et al.

Neural Architecture Search (NAS) was first proposed to achieve state-of-the-art performance through the discovery of new architectural patterns, without human intervention. An over-reliance on expert knowledge in search-space design has, however, led to increased performance (local optima) without significant architectural breakthroughs, preventing truly novel solutions from being reached. In this work we propose 1) casting NAS as the problem of finding the optimal network generator and 2) a new hierarchical, graph-based search space capable of representing an extremely large variety of network types while requiring only a few continuous hyper-parameters. This greatly reduces the dimensionality of the problem, enabling the effective use of Bayesian Optimisation as a search strategy. At the same time, it expands the range of valid architectures, motivating a multi-objective learning approach. We demonstrate the effectiveness of our strategy on six benchmark datasets and show that our search space generates extremely lightweight yet highly competitive models, illustrating the benefits of a NAS approach that optimises over network-generator selection.
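The core idea, searching over the few continuous knobs of an architecture generator rather than over individual architectures, can be sketched in a few lines. The generator below is a hypothetical Watts-Strogatz-style random-graph sampler (the paper's actual generators and objective differ), and simple random search stands in for the Bayesian Optimisation the authors use; the point is only that the outer search space is a handful of scalar hyper-parameters:

```python
import random

def sample_architecture(p, k, rng, n=16):
    # Hypothetical generator: a Watts-Strogatz-style random graph whose
    # nodes stand for operations. (p, k) are the generator hyper-parameters
    # being optimised, not individual edges of any one architecture.
    edges = set()
    for i in range(n):
        for j in range(1, k + 1):
            tgt = (i + j) % n
            if rng.random() < p:          # rewire this edge with probability p
                tgt = rng.randrange(n)
            if tgt != i:
                edges.add((min(i, tgt), max(i, tgt)))
    return edges

def proxy_score(arch, n=16):
    # Stand-in for validation accuracy: rewards moderate edge density.
    density = len(arch) / (n * (n - 1) / 2)
    return 1.0 - abs(density - 0.3)

def search_generator_space(trials=50, seed=0):
    # Outer loop over generator hyper-parameters. Random search here is a
    # placeholder for Bayesian Optimisation; the search space is just two
    # knobs, not a combinatorial space of architectures.
    rng = random.Random(seed)
    best_score, best_hp = -1.0, None
    for _ in range(trials):
        p = rng.random()                  # rewiring probability in [0, 1]
        k = rng.randint(1, 4)             # ring-lattice neighbourhood size
        score = proxy_score(sample_architecture(p, k, rng))
        if score > best_score:
            best_score, best_hp = score, (p, k)
    return best_score, best_hp

score, (p, k) = search_generator_space()
```

Because the outer space is low-dimensional and continuous, a sample-efficient model-based optimiser (e.g. Gaussian-process Bayesian Optimisation) becomes practical where it would not be over a raw discrete architecture space.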

Related research

- Automatic Subspace Evoking for Efficient Neural Architecture Search (10/31/2022): Neural Architecture Search (NAS) aims to automatically find effective ar...
- AutoHAS: Differentiable Hyper-parameter and Architecture Search (06/05/2020): Neural Architecture Search (NAS) has achieved significant progress in pu...
- On the Bounds of Function Approximations (08/26/2019): Within machine learning, the subfield of Neural Architecture Search (NAS...
- Continuous Ant-Based Neural Topology Search (11/21/2020): This work introduces a novel, nature-inspired neural architecture search...
- Scalable NAS with Factorizable Architectural Parameters (12/31/2019): Neural architecture search (NAS) is an emerging topic in machine learnin...
- HKNAS: Classification of Hyperspectral Imagery Based on Hyper Kernel Neural Architecture Search (04/23/2023): Recent neural architecture search (NAS) based approaches have made great...
- AutoRC: Improving BERT Based Relation Classification Models via Architecture Search (09/22/2020): Although BERT based relation classification (RC) models have achieved si...
