HOAX: A Hyperparameter Optimization Algorithm Explorer for Neural Networks

by Albert Thie et al.

Computational chemistry has become an important tool for predicting and understanding molecular properties and reactions. Although recent years have seen significant growth in new algorithms and computational methods that speed up quantum chemical calculations, the bottleneck for trajectory-based methods that study photoinduced processes remains the huge number of electronic structure calculations. In this work, we present a solution that drastically reduces the number of electronic structure calculations by employing machine learning algorithms and methods borrowed from the realm of artificial intelligence. However, applying these algorithms effectively requires finding optimal hyperparameters, which remains a challenge in itself. Here we present HOAX, an automated, user-friendly framework for hyperparameter optimization of neural networks that bypasses the need for a lengthy manual process. The neural-network-generated potential energy surfaces (PESs) reduce computational costs compared to ab initio-based PESs. We perform a comparative investigation of the performance of different hyperparameter optimization algorithms, namely grid search, simulated annealing, genetic algorithms, and Bayesian optimization, in finding the hyperparameters needed to construct well-performing neural networks that fit the PESs of small organic molecules. Our results show that this automated toolkit not only provides a straightforward way to perform hyperparameter optimization but also yields neural-network-generated PESs in reasonable agreement with the ab initio-based PESs.
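To make the setup concrete, the sketch below shows the simplest of the compared strategies, grid search, applied to a tiny from-scratch neural network fitted to a toy one-dimensional Morse-like potential. This is a minimal illustration under stated assumptions, not HOAX's actual API: the function names, the toy PES, and the hyperparameter grid are all hypothetical stand-ins.

```python
# Hedged sketch: grid search over two hyperparameters (hidden width,
# learning rate) of a 1-hidden-layer tanh network fitted to a toy PES.
# All names here are illustrative; HOAX's real interface differs.
import itertools
import math
import random


def pes(r):
    # Toy Morse-style potential standing in for an ab initio PES.
    return (1.0 - math.exp(-(r - 1.5))) ** 2


# Synthetic training set: geometries (bond lengths) and energies.
xs = [0.8 + 0.05 * i for i in range(60)]
ys = [pes(x) for x in xs]


def train_mlp(hidden, lr, epochs=300, seed=0):
    """Train a 1-hidden-layer tanh network with batch gradient descent;
    return the final mean-squared error on the training set."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0
    n = len(xs)
    for _ in range(epochs):
        gw1 = [0.0] * hidden
        gb1 = [0.0] * hidden
        gw2 = [0.0] * hidden
        gb2 = 0.0
        for x, y in zip(xs, ys):
            h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
            pred = sum(w2[j] * h[j] for j in range(hidden)) + b2
            err = pred - y
            gb2 += err
            for j in range(hidden):
                gw2[j] += err * h[j]
                dh = err * w2[j] * (1.0 - h[j] ** 2)  # backprop through tanh
                gw1[j] += dh * x
                gb1[j] += dh
        for j in range(hidden):
            w1[j] -= lr * gw1[j] / n
            b1[j] -= lr * gb1[j] / n
            w2[j] -= lr * gw2[j] / n
        b2 -= lr * gb2 / n
    mse = 0.0
    for x, y in zip(xs, ys):
        h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
        pred = sum(w2[j] * h[j] for j in range(hidden)) + b2
        mse += (pred - y) ** 2
    return mse / n


# Grid search: exhaustively score every hyperparameter combination and
# keep the one with the lowest training error.
grid = {"hidden": [4, 8, 16], "lr": [0.05, 0.1, 0.2]}
best = min(
    (dict(zip(grid, combo)) for combo in itertools.product(*grid.values())),
    key=lambda p: train_mlp(p["hidden"], p["lr"]),
)
print("best hyperparameters:", best)
```

The other optimizers the abstract names differ only in how they propose the next candidate: simulated annealing perturbs the current best and accepts worse candidates with decaying probability, a genetic algorithm recombines and mutates a population of candidates, and a Bayesian optimizer fits a surrogate model of the error surface to pick promising points, so the `train_mlp` objective above could be reused unchanged with any of them.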

