A Dynamically Controlled Recurrent Neural Network for Modeling Dynamical Systems

by Yiwei Fu, et al.

This work proposes a novel neural network architecture, called the Dynamically Controlled Recurrent Neural Network (DCRNN), specifically designed to model dynamical systems governed by ordinary differential equations (ODEs). The state of such a system depends only on its state-space model together with the inputs and the initial conditions. Long Short-Term Memory (LSTM) networks, which have proven very effective for memory-based tasks, may fail to model physical processes because they tend to memorize rather than learn the underlying dynamics. The proposed DCRNN includes learnable skip-connections across previous hidden states, and introduces a regularization term in the loss function based on Lyapunov stability theory. The regularizer enables the placement of the eigenvalues of the transfer function induced by the DCRNN at desired locations, thereby acting as an internal controller for the hidden-state trajectory. The results show that, for forecasting a chaotic dynamical system, the DCRNN outperforms the LSTM in 100 out of 100 randomized experiments, reducing the LSTM's forecasting mean squared error by 80.0% ± 3.0%.
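The two key ideas in the abstract, skip-connections over several previous hidden states and an eigenvalue-placement regularizer on the recurrent weights, can be illustrated with a minimal NumPy sketch. All names, shapes, and the stability margin below are illustrative assumptions, not the authors' implementation; in particular, the paper's regularizer is derived from Lyapunov stability theory, while this sketch simply penalizes recurrent eigenvalues whose magnitude exceeds a target radius.

```python
import numpy as np

def dcrnn_step(x_t, hidden_history, W_in, W_rec, skip_weights):
    """One forward step of a simplified DCRNN-style cell.

    Instead of feeding back only the most recent hidden state,
    the update mixes the K most recent hidden states through
    learnable skip weights (hypothetical formulation).
    """
    # Learnable mixture of the K previous hidden states (skip-connections).
    h_mix = sum(a * h for a, h in zip(skip_weights, hidden_history))
    return np.tanh(W_in @ x_t + W_rec @ h_mix)

def eigenvalue_regularizer(W_rec, target_radius=0.9):
    """Stability-inspired penalty (illustrative stand-in for the
    paper's Lyapunov-based term): penalize eigenvalues of the
    recurrent matrix whose magnitude exceeds target_radius."""
    eigvals = np.linalg.eigvals(W_rec)
    excess = np.abs(eigvals) - target_radius
    return float(np.sum(np.maximum(excess, 0.0) ** 2))

# Example usage with small random weights.
rng = np.random.default_rng(0)
W_in = 0.1 * rng.normal(size=(4, 2))
W_rec = 0.1 * rng.normal(size=(4, 4))
skip_weights = [0.6, 0.3, 0.1]          # weights over the 3 previous states
hidden_history = [np.zeros(4)] * 3

h_next = dcrnn_step(np.ones(2), hidden_history, W_in, W_rec, skip_weights)
penalty = eigenvalue_regularizer(W_rec)
```

In training, `penalty` would be added to the data-fitting loss so that gradient descent pushes the recurrent eigenvalues toward the desired stable region, which is the sense in which the regularizer "controls" the hidden-state trajectory.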
