Removing numerical dispersion from linear evolution equations
In this paper we describe a method for removing the numerical errors that arise, in the modeling of linear evolution equations, from approximating the time derivative by a finite-difference operator. We prove that the method yields a solution with the correct evolution over its entire lifespan. We also demonstrate the method on a model equation as well as on simulations of elastic and viscoelastic wave propagation.
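As a minimal illustration of the phenomenon the abstract refers to (not the paper's correction method), consider a single Fourier mode of a linear evolution equation, du/dt = i·omega·u, with exact solution u(t) = exp(i·omega·t)·u(0). Replacing the time derivative by a forward difference advances the phase by arctan(omega·dt) per step instead of the exact omega·dt; this phase error is a numerical dispersion error. The sketch below computes that per-step phase error for a few time steps:

```python
import numpy as np

# Toy model: one Fourier mode of a linear evolution equation,
#   du/dt = i*omega*u,  exact solution u(t) = exp(i*omega*t) * u(0).
# A forward difference in time gives u^{n+1} = (1 + i*omega*dt) * u^n,
# whose phase advance per step, arctan(omega*dt), lags the exact
# phase advance omega*dt. This is an illustration of numerical
# dispersion only, not the removal method described in the paper.

def phase_error(omega, dt):
    """Phase error per time step of the forward-difference scheme."""
    g = 1.0 + 1j * omega * dt        # numerical amplification factor
    return omega * dt - np.angle(g)  # exact phase minus numerical phase

if __name__ == "__main__":
    omega = 2.0 * np.pi              # angular frequency of the mode
    for dt in (0.1, 0.05, 0.025):
        print(f"dt = {dt:6.3f}   phase error per step = "
              f"{phase_error(omega, dt):.6f}")
```

The error behaves like (omega·dt)³/3 for small steps, so halving dt shrinks it roughly eightfold; over many steps these per-step phase errors accumulate, which is why a dispersion-removal method matters for long simulations.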