Overcoming the curse of dimensionality for approximating Lyapunov functions with deep neural networks under a small-gain condition

01/23/2020
by Lars Grüne, et al.

We propose a deep neural network architecture for storing approximate Lyapunov functions of systems of ordinary differential equations. Under a small-gain condition on the system, the number of neurons needed for an approximation of a Lyapunov function with fixed accuracy grows only polynomially in the state dimension, i.e., the proposed approach is able to overcome the curse of dimensionality.
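The structural idea behind the complexity bound is that a small-gain condition lets a Lyapunov function for the full system be composed from Lyapunov functions of low-dimensional subsystems, roughly V(x) ≈ Σ_i V_i(x_{I_i}), where each index set I_i selects only a few state components. Below is a minimal PyTorch sketch of such a separable architecture. The class name, index sets, layer widths, Softplus activations, and the squaring used to keep each summand nonnegative are illustrative assumptions, not the paper's exact construction; training (e.g., penalizing positive values of the directional derivative ∇V·f along the vector field) is omitted.

```python
import torch
import torch.nn as nn

class SeparableLyapunovNet(nn.Module):
    """Candidate Lyapunov function V(x) = sum_i W_i(x_{I_i})^2, built from
    one small subnetwork per low-dimensional subsystem. The total neuron
    count scales with the number of subsystems rather than exponentially
    in the full state dimension."""

    def __init__(self, index_sets, hidden=32):
        super().__init__()
        # One index set per subsystem; each selects a few state components.
        self.index_sets = [list(idx) for idx in index_sets]
        self.subnets = nn.ModuleList(
            nn.Sequential(
                nn.Linear(len(idx), hidden),
                nn.Softplus(),  # smooth activation keeps V continuously differentiable
                nn.Linear(hidden, hidden),
                nn.Softplus(),
                nn.Linear(hidden, 1),
            )
            for idx in self.index_sets
        )

    def forward(self, x):
        # Each subnetwork only sees its own subsystem's states; squaring
        # the outputs makes every summand (and hence V) nonnegative.
        terms = [net(x[:, idx]).pow(2)
                 for net, idx in zip(self.subnets, self.index_sets)]
        return torch.stack(terms, dim=0).sum(dim=0).squeeze(-1)

# Example: an 8-dimensional state viewed as four planar subsystems.
n, d = 8, 2
index_sets = [range(i, i + d) for i in range(0, n, d)]
V = SeparableLyapunovNet(index_sets)
x = torch.randn(16, n)
print(V(x).shape)  # torch.Size([16])
```

In this sketch, each subnetwork's size depends only on the subsystem dimension d and the desired accuracy, while the number of subnetworks grows linearly with the state dimension n, so the total parameter count grows polynomially in n, which is the scaling behavior the abstract claims.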
