GBT: Two-stage transformer framework for non-stationary time series forecasting

07/17/2023
by   Li Shen, et al.
This paper shows that the time series forecasting Transformer (TSFT) suffers from a severe over-fitting problem caused by the improper initialization of unknown decoder inputs, especially when handling non-stationary time series. Based on this observation, we propose GBT, a novel two-stage Transformer framework with a Good Beginning. It decouples the prediction process of TSFT into two stages, an Auto-Regression stage and a Self-Regression stage, to tackle the problem of differing statistical properties between input and prediction sequences. The prediction results of the Auto-Regression stage serve as a Good Beginning, i.e., a better initialization for the inputs of the Self-Regression stage. We also propose an Error Score Modification module to further enhance the forecasting capability of the Self-Regression stage in GBT. Extensive experiments on seven benchmark datasets demonstrate that GBT outperforms SOTA TSFTs (FEDformer, Pyraformer, ETSformer, etc.) and many other forecasting models (SCINet, N-HiTS, etc.) with only canonical attention and convolution, while having lower time and space complexity. It is also general enough to couple with these models to strengthen their forecasting capability. The source code is available at: https://github.com/OrigamiSL/GBT
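The core idea of the two-stage decoupling can be sketched as follows. This is a minimal illustrative toy, not the authors' implementation: the function names and the simple statistical models standing in for the two Transformer stages are assumptions made for clarity. What it shows is the data flow GBT describes: a first-stage forecast replaces the usual zero or placeholder initialization of the decoder inputs, and the second stage refines that preliminary forecast.

```python
import numpy as np

def auto_regression_stage(history: np.ndarray, horizon: int) -> np.ndarray:
    """Toy stand-in for the Auto-Regression stage: produce a coarse
    preliminary forecast (here, the mean of a recent window).
    In GBT this stage is a Transformer."""
    return np.full(horizon, history[-24:].mean())

def self_regression_stage(history: np.ndarray,
                          good_beginning: np.ndarray) -> np.ndarray:
    """Toy stand-in for the Self-Regression stage: refine the
    preliminary forecast, here by adding a linear trend fitted to the
    history. In GBT this stage is a Transformer whose decoder inputs
    are initialized with `good_beginning` rather than placeholders."""
    trend = np.polyfit(np.arange(len(history)), history, 1)[0]
    steps = np.arange(1, len(good_beginning) + 1)
    return good_beginning + trend * steps

# Non-stationary toy series: oscillation plus a drifting mean.
history = np.sin(np.arange(96) / 8.0) + np.arange(96) * 0.01

prelim = auto_regression_stage(history, horizon=24)   # "Good Beginning"
final = self_regression_stage(history, prelim)        # refined forecast
print(final.shape)  # (24,)
```

The point of the structure is that the second stage never sees an arbitrary (e.g., zero-filled) decoder initialization whose statistics differ from the target sequence; it always starts from a forecast that already roughly matches the prediction window's distribution.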


