Discrete Gradient Flow Approximations of High Dimensional Evolution Partial Differential Equations via Deep Neural Networks

06/01/2022
by   Emmanuil H. Georgoulis, et al.

We consider the approximation of initial/boundary value problems involving possibly high-dimensional, dissipative evolution partial differential equations (PDEs) using a deep neural network framework. More specifically, we first propose discrete gradient flow approximations based on non-standard Dirichlet energies for problems involving essential boundary conditions posed on bounded spatial domains. The boundary conditions are imposed weakly via non-standard functionals; such functionals classically arise in the construction of Galerkin-type numerical methods and are often referred to as "Nitsche-type" methods. Moreover, inspired by the seminal work of Jordan, Kinderlehrer, and Otto (JKO) <cit.>, we consider a second class of discrete gradient flows for special classes of dissipative evolution PDE problems with non-essential boundary conditions. These JKO-type gradient flows are solved via deep neural network approximations. A key, distinctive aspect of the proposed methods is that the discretization is constructed via a sequence of residual-type deep neural networks (DNNs) corresponding to implicit time-stepping. As a result, the PDE solution at each time node is represented by a DNN. This approach offers several advantages in the training of each DNN. We present a series of numerical experiments which showcase the good performance of the Dirichlet-type energy approximations for lower spatial dimensions and the excellent performance of the JKO-type energies for higher spatial dimensions.
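To make the implicit time-stepping idea concrete, the following is a minimal PyTorch sketch of one minimizing-movement step of a discrete gradient flow for the heat equation on the unit cube with homogeneous Dirichlet data: a fresh network is trained at each time node to minimize a time-discrete energy consisting of a movement term, a Monte Carlo Dirichlet energy, and a boundary term. The equation, network architecture, hyperparameters (width, time step tau, penalty gamma), and the simple penalty used in place of the full Nitsche-type functional are all illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: one implicit (minimizing-movement) step of a discrete
# gradient flow for u_t = Laplace(u) on (0,1)^d with u = 0 on the boundary.
# At time node t_{n+1}, a DNN u_theta is trained to minimize
#   J(theta) = (1/2*tau) * int |u_theta - u_n|^2 + (1/2) * int |grad u_theta|^2 + boundary term,
# with integrals replaced by Monte Carlo averages. A plain penalty stands in
# for the Nitsche-type weak imposition described in the abstract.
import torch

d, tau, gamma = 2, 1e-2, 100.0          # spatial dimension, time step, boundary penalty (illustrative)

def make_net():
    # small fully connected network mapping R^d -> R
    return torch.nn.Sequential(
        torch.nn.Linear(d, 64), torch.nn.Tanh(),
        torch.nn.Linear(64, 64), torch.nn.Tanh(),
        torch.nn.Linear(64, 1),
    )

def value_and_grad(net, x):
    # returns u_theta(x) and its spatial gradient via autograd
    x = x.requires_grad_(True)
    u = net(x)
    (g,) = torch.autograd.grad(u.sum(), x, create_graph=True)
    return u, g

def one_step(u_prev, net, n_int=4096, n_bdy=1024, iters=500):
    """Train `net` to represent the solution at the next time node."""
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(iters):
        x = torch.rand(n_int, d)                        # interior samples in (0,1)^d
        xb = torch.rand(n_bdy, d)                       # boundary samples: snap one coordinate to {0,1}
        idx = torch.randint(0, d, (n_bdy,))
        xb[torch.arange(n_bdy), idx] = torch.randint(0, 2, (n_bdy,)).float()

        u, g = value_and_grad(net, x)
        with torch.no_grad():
            u_old = u_prev(x)                           # previous time-node solution
        movement = ((u - u_old) ** 2).mean() / (2 * tau)
        energy = 0.5 * (g ** 2).sum(dim=1).mean()       # Dirichlet energy (Monte Carlo average)
        penalty = gamma * (net(xb) ** 2).mean()         # weak enforcement of u = 0 on the boundary

        loss = movement + energy + penalty
        opt.zero_grad(); loss.backward(); opt.step()
    return net

u0 = lambda x: torch.sin(torch.pi * x).prod(dim=1, keepdim=True)  # example initial condition
net1 = one_step(u0, make_net())                                   # DNN representing u at t = tau
```

In this reading, repeating `one_step` with the freshly trained network as `u_prev` produces the sequence of DNNs, one per time node, that the abstract refers to; the JKO-type variant would replace the L2 movement term and Dirichlet energy by the corresponding Wasserstein-type and entropy/energy functionals.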
