PSA-GAN: Progressive Self Attention GANs for Synthetic Time Series

08/02/2021
by Paul Jeha, et al.

Realistic synthetic time series data of sufficient length enables practical applications in time series modeling tasks, such as forecasting, but remains a challenge to generate. In this paper, we present PSA-GAN, a generative adversarial network (GAN) that generates long, high-quality time series samples by combining progressive growing of GANs with self-attention. We show that PSA-GAN reduces the error in two downstream forecasting tasks over baselines that only use real data. We also introduce a Fréchet Inception Distance-like score, Context-FID, for assessing the quality of synthetic time series samples. In our downstream tasks, the lowest-scoring models correspond to the best-performing ones, suggesting that Context-FID could be a useful tool for developing time series GAN models.
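The Context-FID score follows the Fréchet Inception Distance recipe: fit a Gaussian to embeddings of real samples and another to embeddings of synthetic samples, then compute the Fréchet distance between the two Gaussians. As a minimal sketch (the time series embedding network itself is not specified here, so the function assumes precomputed embedding matrices):

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(real_feats: np.ndarray, fake_feats: np.ndarray) -> float:
    """Fréchet distance between Gaussian fits of two embedding sets.

    real_feats, fake_feats: arrays of shape (n_samples, dim) holding
    time series embeddings from some representation network (an
    assumption here; PSA-GAN's actual embedding model is not shown).
    """
    # Fit a Gaussian (mean, covariance) to each set of embeddings.
    mu_r, mu_f = real_feats.mean(axis=0), fake_feats.mean(axis=0)
    cov_r = np.cov(real_feats, rowvar=False)
    cov_f = np.cov(fake_feats, rowvar=False)

    # Matrix square root of the covariance product; numerical noise
    # can introduce a tiny imaginary component, which we discard.
    covmean = sqrtm(cov_r @ cov_f)
    if np.iscomplexobj(covmean):
        covmean = covmean.real

    # ||mu_r - mu_f||^2 + Tr(cov_r + cov_f - 2 * sqrt(cov_r cov_f))
    return float(np.sum((mu_r - mu_f) ** 2)
                 + np.trace(cov_r + cov_f - 2.0 * covmean))
```

A lower score indicates that the synthetic embeddings are distributed more like the real ones; identical embedding sets score (numerically) zero.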

Related research

- 06/03/2023 — GAT-GAN: A Graph-Attention-based Time-Series Generative Adversarial Network
  Generative Adversarial Networks (GANs) have proven to be a powerful tool...

- 07/21/2023 — Predict, Refine, Synthesize: Self-Guiding Diffusion Models for Probabilistic Time Series Forecasting
  Diffusion models have achieved state-of-the-art performance in generativ...

- 05/27/2022 — Group GAN
  Generating multivariate time series is a promising approach for sharing ...

- 10/25/2022 — Mitigating Health Data Poverty: Generative Approaches versus Resampling for Time-series Clinical Data
  Several approaches have been developed to mitigate algorithmic bias stem...

- 06/30/2020 — Conditional GAN for timeseries generation
  It is abundantly clear that time dependent data is a vital source of inf...

- 03/02/2020 — Subadditivity of Probability Divergences on Bayes-Nets with Applications to Time Series GANs
  GANs for time series data often use sliding windows or self-attention to...

- 06/18/2021 — A Unified Generative Adversarial Network Training via Self-Labeling and Self-Attention
  We propose a novel GAN training scheme that can handle any level of labe...
