Scaling the leading accuracy of deep equivariant models to biomolecular simulations of realistic size

04/20/2023
by   Albert Musaelian, et al.

This work brings the leading accuracy, sample efficiency, and robustness of deep equivariant neural networks to extreme computational scale. This is achieved through a combination of innovative model architecture, massive parallelization, and models and implementations optimized for efficient GPU utilization. The resulting Allegro architecture bridges the accuracy-speed tradeoff of atomistic simulations and enables description of dynamics in structures of unprecedented complexity at quantum fidelity. To illustrate the scalability of Allegro, we perform nanoseconds-long stable simulations of protein dynamics and scale up to a 44-million atom structure of a complete, all-atom, explicitly solvated HIV capsid on the Perlmutter supercomputer. We demonstrate excellent strong scaling up to 100 million atoms and 70% scaling to 5120 A100 GPUs.
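The "70% scaling to 5120 A100 GPUs" figure refers to strong-scaling efficiency: how much of the ideal speedup is retained when a fixed-size problem is spread over more GPUs. A minimal sketch of that calculation is below; all timings and GPU counts in the example are hypothetical illustrations, not measurements from the paper.

```python
# Sketch of the strong-scaling efficiency metric. The timings here are
# made up for illustration; only the formula reflects standard practice.

def strong_scaling_efficiency(t_base: float, n_base: int,
                              t_scaled: float, n_scaled: int) -> float:
    """Fraction of ideal speedup retained when the same fixed-size
    problem runs on more GPUs: (t_base / t_scaled) / (n_scaled / n_base)."""
    speedup = t_base / t_scaled          # observed speedup
    ideal = n_scaled / n_base            # speedup if scaling were perfect
    return speedup / ideal

# Hypothetical example: 40x more GPUs (128 -> 5120) while wall time
# drops 28x yields 70% strong-scaling efficiency.
eff = strong_scaling_efficiency(t_base=280.0, n_base=128,
                                t_scaled=10.0, n_scaled=5120)
print(f"{eff:.0%}")  # 70%
```

An efficiency of 1.0 would mean perfectly linear scaling; communication and load-imbalance overheads push real simulations below that as GPU counts grow.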


Related research:

- 03/14/2023: Allegro-Legato: Scalable, Fast, and Robust Neural-Network Quantum Molecular Dynamics via Sharpness-Aware Minimization. Neural-network quantum molecular dynamics (NNQMD) simulations based on m...

- 06/16/2020: Heterogeneous Parallelization and Acceleration of Molecular Dynamics Simulations in GROMACS. The introduction of accelerator devices such as graphics processing unit...

- 01/05/2022: Extending the limit of molecular dynamics with ab initio accuracy to 10 billion atoms. High-performance computing, together with a neural network model trained...

- 05/12/2020: Heterogeneous CPU/GPU co-execution of CFD simulations on the POWER9 architecture: Application to airplane aerodynamics. High fidelity Computational Fluid Dynamics simulations are generally ass...

- 11/12/2019: Mirheo: High-Performance Mesoscale Simulations for Microfluidics. The transport and manipulation of particles and cells in microfluidic de...

- 07/24/2018: An argument in favor of strong scaling for deep neural networks with small datasets. In recent years, with the popularization of deep learning frameworks and...

- 01/21/2023: SuperScaler: Supporting Flexible DNN Parallelization via a Unified Abstraction. With the growing model size, deep neural networks (DNN) are increasingly...
