A Specialized Evolutionary Strategy Using Mean Absolute Error Random Sampling to Design Recurrent Neural Networks

09/04/2019
by Andrés Camero, et al.

Recurrent neural networks have proven effective at solving prediction problems. However, finding a network that suits a given problem is hard because of their high sensitivity to hyperparameter configuration. Automatic hyperparameter optimization methods help find the most suitable configuration, but they are not widely adopted because of their high computational cost. In this work, we study the use of mean absolute error random sampling to compare multiple-hidden-layer architectures, and we propose an evolutionary-strategy-based algorithm that uses its results to optimize the configuration of a recurrent network. We empirically validate our proposal, showing that it is possible to predict and compare the expected performance of a hyperparameter configuration at low cost, and to use these predictions to optimize the configuration of a recurrent network.
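The core idea of MAE random sampling can be sketched as follows: instead of training a network for each candidate architecture, evaluate many random weight initializations and summarize the resulting error distribution as a cheap performance proxy. This is an illustrative sketch only, assuming a simple one-layer Elman RNN in NumPy, a fixed input window of 5, and a 5th-percentile summary; the paper's actual architectures and sampling protocol may differ.

```python
import numpy as np

def rnn_forward(x, Wx, Wh, Wy, bh, by):
    # Minimal Elman RNN: tanh hidden state, linear output (illustrative).
    h = np.zeros(Wh.shape[0])
    for t in range(len(x)):
        h = np.tanh(Wx @ x[t] + Wh @ h + bh)
    return Wy @ h + by

def mae_random_sampling(series, hidden_size, n_samples=100, seed=0):
    """Cheap proxy for an architecture's expected performance:
    sample random weights (no training), measure MAE on one-step-ahead
    prediction, and return a low percentile of the MAE distribution."""
    rng = np.random.default_rng(seed)
    window = 5  # assumed input window length
    X = [series[i:i + window].reshape(-1, 1) for i in range(len(series) - window)]
    y = np.array([series[i + window] for i in range(len(series) - window)])
    maes = []
    for _ in range(n_samples):
        # Draw one random weight configuration for the fixed architecture.
        Wx = rng.normal(0, 1, (hidden_size, 1))
        Wh = rng.normal(0, 1, (hidden_size, hidden_size))
        Wy = rng.normal(0, 1, (1, hidden_size))
        bh = rng.normal(0, 1, hidden_size)
        by = rng.normal(0, 1, 1)
        preds = np.array([rnn_forward(x, Wx, Wh, Wy, bh, by)[0] for x in X])
        maes.append(np.mean(np.abs(preds - y)))
    # Low tail of the distribution: how good this architecture can get.
    return float(np.percentile(maes, 5))
```

An evolutionary strategy could then use this score as its fitness function, e.g. comparing `mae_random_sampling(series, h)` across candidate hidden sizes `h` and mutating the best configurations, which avoids any gradient-based training during the search.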
