Local Randomized Neural Networks with Discontinuous Galerkin Methods for Partial Differential Equations
Randomized neural networks (RNNs) are a variant of neural networks in which the hidden-layer parameters are fixed to randomly assigned values and the output-layer parameters are obtained by solving a linear least-squares system. This improves efficiency without degrading accuracy. In this paper, we combine the local RNN (LRNN) idea with the discontinuous Galerkin (DG) approach for solving partial differential equations: RNNs approximate the solution on subdomains, and the DG formulation glues them together. Taking the Poisson problem as a model, we propose three numerical schemes and provide convergence analyses. We then extend these ideas to time-dependent problems: taking the heat equation as a model, we propose three space-time LRNN-DG formulations. Finally, we present numerical tests demonstrating the performance of the methods developed herein, comparing them with the finite element method and the standard DG method. The LRNN-DG methods achieve better accuracy for the same number of degrees of freedom, indicating that this new approach has great potential for solving partial differential equations.
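To illustrate the core RNN mechanism described above — fixed random hidden-layer parameters with output-layer weights found by linear least squares — here is a minimal sketch. The network width, weight ranges, target function, and sampling are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Randomized neural network (extreme-learning-machine style) sketch:
# hidden-layer weights and biases are fixed at random values; only the
# linear output-layer coefficients are trained, via least squares.
rng = np.random.default_rng(0)

n_hidden = 50
W = rng.uniform(-3.0, 3.0, size=n_hidden)  # fixed random hidden weights (assumed range)
b = rng.uniform(-3.0, 3.0, size=n_hidden)  # fixed random hidden biases (assumed range)

def features(x):
    """Hidden-layer activations tanh(W*x + b); these stay fixed throughout."""
    return np.tanh(np.outer(x, W) + b)

# Sample points and an illustrative target function u(x) = sin(pi x)
x = np.linspace(0.0, 1.0, 200)
u = np.sin(np.pi * x)

# Output-layer coefficients from a single linear least-squares solve
A = features(x)
coef, *_ = np.linalg.lstsq(A, u, rcond=None)

# Evaluate the trained network and measure the fit
u_rnn = A @ coef
max_err = np.max(np.abs(u_rnn - u))
```

Because only the linear output layer is solved for, training reduces to one least-squares problem instead of an iterative nonlinear optimization, which is the efficiency gain the abstract refers to. In the LRNN-DG setting, one such network would be used per subdomain, with the DG formulation coupling them at interfaces.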