SpiderNet: Hybrid Differentiable-Evolutionary Architecture Search via Train-Free Metrics

04/20/2022
by Rob Geada, et al.

Neural Architecture Search (NAS) algorithms are intended to remove the burden of manual neural network design, and have been shown to be capable of designing excellent models for a variety of well-known problems. However, these algorithms require a variety of design parameters in the form of user configuration or hard-coded decisions, which limit the variety of networks that can be discovered. As a result, NAS algorithms do not eliminate model design tuning; they merely shift where that tuning must be applied. In this paper, we present SpiderNet, a hybrid differentiable-evolutionary and hardware-aware algorithm that rapidly and efficiently produces state-of-the-art networks. More importantly, SpiderNet is a proof-of-concept of a minimally-configured NAS algorithm: the majority of design choices seen in other algorithms are incorporated into SpiderNet's dynamically evolving search space, reducing the number of user choices to just two, reduction cell count and initial channel count. SpiderNet produces models highly competitive with the state of the art, and outperforms random search in accuracy, runtime, memory size, and parameter count.
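To make the abstract's "two user choices plus evolution" idea concrete, here is a toy sketch of an evolutionary search loop guided by a train-free fitness proxy. This is not SpiderNet's actual algorithm: the operation set, mutation scheme, and scoring function below are hypothetical stand-ins chosen only to illustrate the shape of such a search.

```python
import random

# The only two user-facing choices the abstract names; values are
# illustrative. Everything else evolves during search.
REDUCTION_CELLS = 2      # user choice 1: number of reduction cells
INITIAL_CHANNELS = 16    # user choice 2: initial channel count

# Hypothetical candidate operations for each node in a cell.
OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]

def random_arch(n_nodes=4):
    """Sample a toy architecture: one operation per node."""
    return [random.choice(OPS) for _ in range(n_nodes)]

def mutate(arch):
    """Evolutionary step: re-sample one node's operation."""
    child = list(arch)
    child[random.randrange(len(child))] = random.choice(OPS)
    return child

def train_free_score(arch):
    """Stand-in for a train-free metric: a fixed toy preference
    ordering over operations, so no training is needed to rank."""
    weights = {"conv3x3": 3, "conv5x5": 2, "maxpool": 1, "identity": 0}
    return sum(weights[op] for op in arch)

def evolve(generations=50, pop_size=8):
    """Simple (mu+1)-style loop: mutate the best, replace the worst."""
    population = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=train_free_score, reverse=True)
        population[-1] = mutate(population[0])
    return max(population, key=train_free_score)

best = evolve()
```

Because the score never requires training a network, the loop above evaluates dozens of candidates in milliseconds; the real trade-off in train-free NAS is how well the proxy score correlates with trained accuracy.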

Related research:

- 11/24/2018: Evolutionary-Neural Hybrid Agents for Architecture Search. "Neural Architecture Search has recently shown potential to automate the ..."
- 08/30/2022: You Only Search Once: On Lightweight Differentiable Architecture Search for Resource-Constrained Embedded Platforms. "Benefiting from the search efficiency, differentiable neural architectur..."
- 01/01/2021: Neural Architecture Search via Combinatorial Multi-Armed Bandit. "Neural Architecture Search (NAS) has gained significant popularity as an..."
- 11/22/2020: FP-NAS: Fast Probabilistic Neural Architecture Search. "Differential Neural Architecture Search (NAS) requires all layer choices..."
- 02/28/2023: EvoPrompting: Language Models for Code-Level Neural Architecture Search. "Given the recent impressive accomplishments of language models (LMs) for..."
- 04/05/2019: Single-Path NAS: Designing Hardware-Efficient ConvNets in less than 4 Hours. "Can we automatically design a Convolutional Network (ConvNet) with the h..."
- 02/05/2018: Regularized Evolution for Image Classifier Architecture Search. "The effort devoted to hand-crafting image classifiers has motivated the ..."
