Deep Latent State Space Models for Time-Series Generation

12/24/2022
by Linqi Zhou, et al.

Methods based on ordinary differential equations (ODEs) are widely used to build generative models of time series. In addition to the high computational overhead of explicitly computing the hidden state recurrence, existing ODE-based models fall short in learning sequence data with sharp transitions, which are common in many real-world systems, due to numerical challenges during optimization. In this work, we propose LS4, a generative model for sequences with latent variables evolving according to a state space ODE to increase modeling capacity. Inspired by recent deep state space models (S4), we achieve speedups by leveraging a convolutional representation of LS4 that bypasses the explicit evaluation of hidden states. We show that LS4 significantly outperforms previous continuous-time generative models in terms of marginal distribution, classification, and prediction scores on real-world datasets from the Monash Forecasting Repository, and is capable of modeling highly stochastic data with sharp temporal transitions. LS4 sets a new state of the art for continuous-time latent generative models, with significant improvements in mean squared error and tighter variational lower bounds on irregularly-sampled datasets, while also being 100x faster than other baselines on long sequences.
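The speedup claim rests on the S4-style equivalence between unrolling a linear state space model as a recurrence over hidden states and applying a single causal convolution whose kernel is built from the state matrices. The toy sketch below (a hypothetical NumPy illustration, not the authors' LS4 code; all variable names are made up) shows both views of one discretized linear state space layer and checks that they produce the same output, while the convolutional path never materializes the hidden states.

```python
import numpy as np

# Toy linear state space layer: x'(t) = A x(t) + B u(t), y(t) = C x(t),
# discretized with zero-order hold. Illustrative only; not the LS4 model.
rng = np.random.default_rng(0)
N, L = 4, 32                                   # state dimension, sequence length
A = -np.diag(rng.uniform(0.5, 2.0, N))         # stable diagonal dynamics (assumed for simplicity)
B = rng.standard_normal((N, 1))
C = rng.standard_normal((1, N))
dt = 0.1

# Zero-order-hold discretization: x_{k+1} = Ad x_k + Bd u_k, y_k = C x_{k+1}
Ad = np.diag(np.exp(np.diag(A) * dt))
Bd = np.diag((np.exp(np.diag(A) * dt) - 1.0) / np.diag(A)) @ B

u = rng.standard_normal(L)

# 1) Explicit recurrence over hidden states (the slow path the paper avoids).
x = np.zeros((N, 1))
y_rec = np.zeros(L)
for k in range(L):
    x = Ad @ x + Bd * u[k]
    y_rec[k] = (C @ x).item()

# 2) Convolutional representation: precompute the kernel K_k = C Ad^k Bd,
#    then produce the whole output with one causal convolution.
K = np.zeros(L)
M = Bd.copy()
for k in range(L):
    K[k] = (C @ M).item()
    M = Ad @ M

y_conv = np.convolve(u, K)[:L]

assert np.allclose(y_rec, y_conv, atol=1e-8)
print("max |y_rec - y_conv| =", np.abs(y_rec - y_conv).max())
```

In practice the kernel can be computed once and applied with FFT-based convolution, which is what makes the convolutional view much faster than stepping through the recurrence on long sequences.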


