On the Exploitation of Neuroevolutionary Information: Analyzing the Past for a More Efficient Future

by Unai Garciarena, et al.

Neuroevolutionary algorithms, which search for neural network structures automatically by means of evolutionary techniques, are computationally costly procedures. Despite this cost, these methods are widely applied because of the strong performance of the architectures they find. The final outcome of a neuroevolutionary process is the best structure found during the search, and the rest of the procedure is commonly discarded in the literature. However, these searches also produce a considerable amount of residual information from which valuable knowledge can be extracted. In this paper, we propose an approach that extracts this information from neuroevolutionary runs and uses it to build a metamodel that can positively impact future neural architecture searches. More specifically, by inspecting the best structures found during neuroevolutionary searches of generative adversarial networks with varying characteristics (e.g., based on dense or convolutional layers), we propose a Bayesian network-based model which can be used to find strong neural structures right away, to conveniently initialize structural searches for different problems, or to help future structural optimization of any type keep finding increasingly better structures where uninformed methods get stuck in local optima.
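The idea of mining past searches for a generative metamodel can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy, not the authors' method: it treats each previously found architecture as a sequence of layer types (the `past_best` records are hypothetical) and fits a simple chain-structured probabilistic model, P(layer_t | layer_{t-1}), by counting transitions. Sampling from that model then proposes new candidate structures biased toward patterns that worked before, which is one way such a model could seed or guide a later search.

```python
import random
from collections import Counter, defaultdict

# Hypothetical best architectures collected from past neuroevolutionary
# runs, each encoded as a sequence of layer types.
past_best = [
    ["conv", "conv", "dense"],
    ["conv", "pool", "dense"],
    ["conv", "conv", "pool", "dense"],
    ["dense", "dense"],
]

START, END = "<start>", "<end>"

# Estimate transition counts for P(layer_t | layer_{t-1}) from the records.
transitions = defaultdict(Counter)
for arch in past_best:
    seq = [START] + arch + [END]
    for prev, cur in zip(seq, seq[1:]):
        transitions[prev][cur] += 1

def sample_architecture(rng, max_len=8):
    """Sample a new layer sequence from the learned chain model."""
    arch, prev = [], START
    while len(arch) < max_len:
        counts = transitions[prev]
        layers, weights = zip(*counts.items())
        cur = rng.choices(layers, weights=weights)[0]
        if cur == END:
            break
        arch.append(cur)
        prev = cur
    return arch

rng = random.Random(0)
print(sample_architecture(rng))
```

A chain model is the simplest possible Bayesian network over layer sequences; the paper's model can additionally condition on structural characteristics such as layer counts or problem type, but the extract-count-sample loop above captures the core mechanism.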




