Removing numerical dispersion from linear evolution equations

06/22/2019
by Jens Wittsten, et al.

In this paper we describe a method for removing the numerical errors that arise in the modeling of linear evolution equations when the time derivative is approximated by a finite difference operator. We prove that the method yields a solution with the correct evolution throughout its entire lifespan. We also demonstrate the method on a model equation and on simulations of elastic and viscoelastic wave propagation.
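The abstract does not detail the authors' construction, but the phenomenon it targets is easy to exhibit on a single Fourier mode u'(t) = iωu(t) of a linear evolution equation. A centered (leapfrog) time difference propagates the mode with discrete phase arcsin(ωΔt) per step instead of ωΔt, and this phase error accumulates linearly in time, i.e. numerical dispersion. The sketch below illustrates one standard way to cancel it, pre-warping the frequency so that the discrete phase matches the exact one; this correction is an assumption for illustration, not necessarily the scheme of the paper.

```python
import numpy as np

omega = 2.0 * np.pi   # angular frequency of the Fourier mode
dt = 0.02             # time step
n_steps = 500         # integrate up to T = n_steps * dt = 10

def leapfrog(om_scheme, u0, u1, n):
    """Centered-difference (leapfrog) stepping of u'(t) = i*om_scheme*u(t):
    u^{k+1} = u^{k-1} + 2i*om_scheme*dt * u^k, starting from u0, u1."""
    for _ in range(n):
        u0, u1 = u1, u0 + 2j * om_scheme * dt * u1
    return u1

# Exact solution at the final time, and exact first step to seed the scheme.
exact = np.exp(1j * omega * n_steps * dt)
u0, u1 = 1.0 + 0.0j, np.exp(1j * omega * dt)

# Plain scheme: discrete phase per step is arcsin(omega*dt) != omega*dt,
# so the phase error grows with the number of steps (numerical dispersion).
plain = leapfrog(omega, u0, u1, n_steps - 1)

# Pre-warped frequency: with om_c = sin(omega*dt)/dt the dispersion relation
# sin(theta) = om_c*dt is solved by theta = omega*dt, the exact phase.
omega_c = np.sin(omega * dt) / dt
corrected = leapfrog(omega_c, u0, u1, n_steps - 1)

print("phase error, plain:    ", abs(np.angle(plain / exact)))
print("phase error, corrected:", abs(np.angle(corrected / exact)))
```

The corrected run reduces the accumulated phase error to round-off level for this single mode; for a full PDE the correction would have to be applied mode by mode, which is where a systematic method such as the paper's becomes necessary.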
