Deep Recurrent Neural Networks for Time Series Prediction

07/22/2014
by Sharat C. Prasad, et al.

The ability of deep networks to extract high-level features and of recurrent networks to perform time-series inference has been well studied. Given the universality of a single-hidden-layer network at approximating functions under weak constraints, the benefit of multiple layers is to enlarge the space of dynamical systems that can be approximated or, for a given space, to reduce the number of units required to achieve a certain error. Traditionally, shallow networks with manually engineered features are used, the back-propagation extent is limited to one step, and a large number of hidden units is chosen in an attempt to satisfy the Markov condition. In the case of Markov models, however, it has been shown that many systems must be modeled as higher order. In the present work, we propose deep recurrent networks with a longer backpropagation-through-time extent as a solution for modeling high-order systems and for predicting ahead. We study an epileptic-seizure-suppression electro-stimulator. Extraction of manually engineered complex features and prediction based on them has not permitted small, low-power implementations because, to avoid the possibility of further surgery, extraction of every feature that may ever be required must be included. In our solution, a recurrent neural network performs both feature extraction and prediction. We prove analytically that adding hidden layers or increasing the backpropagation extent increases the rate of decrease of the approximation error. A Dynamic Programming (DP) training procedure employing matrix operations is derived; DP and the use of matrix operations make the procedure efficient, particularly on data-parallel hardware. Simulation studies illustrate the geometry of the parameter space, show that the network learns the temporal structure, show that the parameters converge while the model output displays the same dynamic behavior as the system, and achieve a greater than 0.99 Average Detection Rate on all real seizure data tried.
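The abstract does not give the DP training procedure itself, so the following is only an illustrative sketch of the general idea it names: a recurrent network trained entirely with matrix operations and a backpropagation-through-time extent longer than one step. All function names, hyperparameters, and the toy sine-wave task are our own choices for illustration, not the paper's.

```python
import numpy as np

def train_rnn_tbptt(series, hidden=8, tbptt_len=4, lr=0.05, epochs=200, seed=0):
    """One-step-ahead prediction with a single-layer tanh RNN trained by
    truncated backpropagation through time (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    Wx = rng.normal(0.0, 0.3, (hidden, 1))       # input -> hidden
    Wh = rng.normal(0.0, 0.3, (hidden, hidden))  # hidden -> hidden (recurrence)
    Wo = rng.normal(0.0, 0.3, (1, hidden))       # hidden -> output
    for _ in range(epochs):
        h = np.zeros((hidden, 1))
        for t0 in range(0, len(series) - tbptt_len - 1, tbptt_len):
            hs, xs = [h], []
            # forward pass over a truncated window of tbptt_len steps
            for t in range(t0, t0 + tbptt_len):
                x = np.array([[series[t]]])
                h = np.tanh(Wx @ x + Wh @ hs[-1])
                hs.append(h)
                xs.append(x)
            y = Wo @ h                            # predict the next value
            err = y - series[t0 + tbptt_len]
            # backward pass: gradients flow tbptt_len steps through time,
            # i.e. the backpropagation extent is greater than one
            dWo = err * h.T
            dh = Wo.T * err
            dWx, dWh = np.zeros_like(Wx), np.zeros_like(Wh)
            for k in range(tbptt_len - 1, -1, -1):
                dz = dh * (1.0 - hs[k + 1] ** 2)  # tanh derivative
                dWx += dz @ xs[k].T
                dWh += dz @ hs[k].T
                dh = Wh.T @ dz
            Wx -= lr * dWx
            Wh -= lr * dWh
            Wo -= lr * dWo
            h = hs[-1]  # carry (detached) state into the next window
    return Wx, Wh, Wo

def predict(series, Wx, Wh, Wo):
    """Run the trained RNN over a history and return the one-step prediction."""
    h = np.zeros((Wh.shape[0], 1))
    for v in series:
        h = np.tanh(Wx @ np.array([[v]]) + Wh @ h)
    return float(Wo @ h)
```

Note the design point this sketch shares with the abstract: the same recurrent state serves as both the extracted feature and the basis for prediction, and every update is a small dense matrix product, which is what makes a data-parallel implementation natural.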

