A Study of Genetic Algorithms for Hyperparameter Optimization of Neural Networks in Machine Translation

by Keshav Ganapathy, et al.

With neural networks having demonstrated their versatility and benefits, the need for their optimal performance is as prevalent as ever. Hyperparameters, a defining characteristic of a network, can greatly affect its performance. Engineers therefore go through a tuning process to identify and implement optimal hyperparameters. However, tuning demands excessive manual effort across network architectures, training configurations, and preprocessing settings such as Byte Pair Encoding (BPE). In this study, we propose an automatic tuning method modeled after Darwin's theory of survival of the fittest via a genetic algorithm (GA). Results show that the proposed GA outperforms a random selection of hyperparameters.
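The abstract describes the core loop of such an approach: evaluate a population of hyperparameter configurations, keep the fittest, and breed new candidates via crossover and mutation. The paper's exact operators and search space are not given here, so the sketch below is purely illustrative: the search space, the stand-in fitness function, and all parameter names are assumptions, not the authors' implementation. A real fitness function would train a translation model and return a validation score such as BLEU.

```python
import random

# Illustrative hyperparameter search space (assumed, not from the paper).
SEARCH_SPACE = {
    "learning_rate": [1e-4, 5e-4, 1e-3, 5e-3],
    "hidden_size": [128, 256, 512],
    "num_layers": [2, 4, 6],
    "bpe_merges": [8000, 16000, 32000],
}

def random_individual():
    """Sample one hyperparameter configuration uniformly at random."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def fitness(ind):
    """Stand-in for training a model and scoring it on validation data.

    A real fitness function would train the network with these
    hyperparameters and return, e.g., validation BLEU.
    """
    return (-abs(ind["learning_rate"] - 1e-3)
            - abs(ind["hidden_size"] - 256) / 1000)

def crossover(a, b):
    """Uniform crossover: each gene is taken from one parent at random."""
    return {k: random.choice([a[k], b[k]]) for k in SEARCH_SPACE}

def mutate(ind, rate=0.2):
    """With probability `rate`, resample each gene from the search space."""
    return {k: random.choice(SEARCH_SPACE[k]) if random.random() < rate else v
            for k, v in ind.items()}

def evolve(pop_size=10, generations=5):
    """Run the GA loop: rank, keep the fittest half, refill by breeding."""
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        survivors = ranked[: pop_size // 2]  # survival of the fittest
        children = [mutate(crossover(*random.sample(survivors, 2)))
                    for _ in range(pop_size - len(survivors))]
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
```

Because selection always preserves the current best configuration, the top fitness in the population is non-decreasing across generations, which is what lets the GA beat uniform random sampling under the same evaluation budget.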




