ICDARTS: Improving the Stability and Performance of Cyclic DARTS

09/01/2023
by Emily Herron, et al.

This work introduces improvements to the stability and generalizability of Cyclic DARTS (CDARTS). CDARTS is a Differentiable Architecture Search (DARTS)-based approach to neural architecture search (NAS) that uses a cyclic feedback mechanism to train search and evaluation networks concurrently. This training protocol aims to optimize the search process by enforcing that the search and evaluation networks produce similar outputs. However, CDARTS introduces a loss function for the evaluation network that depends on the search network. Because this loss differs from the one used to train the evaluation network during the retraining phase, the search-phase evaluation network becomes a sub-optimal proxy for the final evaluation network used in retraining. We present ICDARTS, a revised approach that removes the dependency of the evaluation network weights on those of the search network, along with a modified process for discretizing the search network's zero operations that allows these operations to be retained in the final evaluation networks. We complement these changes with ablation studies of the ICDARTS algorithm and network template. Finally, we explore methods for expanding the search space of ICDARTS by enlarging its operation set and investigating alternate methods for discretizing its continuous search cells. These experiments yield networks with improved generalizability and a novel method for incorporating a dynamic search space into ICDARTS.
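To make the training protocol concrete, below is a minimal PyTorch-style sketch of one cyclic search step in which the evaluation network's weight update no longer depends on the search network, while the architecture parameters are still updated with a feedback term that encourages agreement between the two networks. All names (search_net, eval_net, lambda_kd) and the loop structure are illustrative assumptions for exposition, not the authors' implementation.

    import torch
    import torch.nn.functional as F

    def cyclic_search_step(search_net, eval_net, w_opt_s, w_opt_e, alpha_opt,
                           train_batch, val_batch, lambda_kd=1.0):
        # Illustrative sketch only; hyperparameters and structure are assumptions.
        x_tr, y_tr = train_batch
        x_val, y_val = val_batch

        # 1) Update the search-network weights on the training split.
        w_opt_s.zero_grad()
        loss_s = F.cross_entropy(search_net(x_tr), y_tr)
        loss_s.backward()
        w_opt_s.step()

        # 2) Update the evaluation-network weights on the training split.
        #    In the revision sketched here, this loss does NOT depend on the
        #    search network, so the search-phase evaluation network is trained
        #    the same way as the final, retrained evaluation network.
        w_opt_e.zero_grad()
        loss_e = F.cross_entropy(eval_net(x_tr), y_tr)
        loss_e.backward()
        w_opt_e.step()

        # 3) Update the architecture parameters (held by alpha_opt) on the
        #    validation split, with a feedback term that encourages the search
        #    and evaluation networks to produce similar outputs.
        alpha_opt.zero_grad()
        logits_s = search_net(x_val)
        with torch.no_grad():
            logits_e = eval_net(x_val)
        agreement = F.kl_div(F.log_softmax(logits_s, dim=1),
                             F.softmax(logits_e, dim=1),
                             reduction="batchmean")
        loss_alpha = F.cross_entropy(logits_s, y_val) + lambda_kd * agreement
        loss_alpha.backward()
        alpha_opt.step()

In this sketch, step 2 is what distinguishes the revised protocol from a feedback scheme whose evaluation-network loss is tied to the search network: the evaluation network is optimized against the ground-truth labels alone, so it remains a faithful proxy for the network trained from scratch during retraining.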

