Theory and Algorithms for Forecasting Time Series

03/15/2018
by Vitaly Kuznetsov, et al.

We present data-dependent learning bounds for the general scenario of non-stationary, non-mixing stochastic processes. Our learning guarantees are expressed in terms of a data-dependent measure of sequential complexity and a discrepancy measure that can be estimated from data under mild assumptions. We also provide a novel analysis of stable time series forecasting algorithms using the new notion of discrepancy that we introduce. Finally, we use our learning bounds to devise new algorithms for non-stationary time series forecasting, for which we report preliminary experimental results.
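The abstract's discrepancy measure compares the target (future) distribution with a weighted combination of past distributions. As a rough intuition only, the sketch below computes a simplified empirical proxy: the gap between a forecaster's weighted average past loss and its loss on the most recent block of observations. The function name, the uniform weighting, and the recent-block stand-in for the target distribution are all illustrative assumptions, not the paper's actual estimator.

```python
import numpy as np

def empirical_discrepancy(losses, q, window):
    """Simplified illustrative proxy for discrepancy: the gap between
    the q-weighted average past loss of a fixed forecaster and its
    average loss on the most recent `window` steps (a crude stand-in
    for the target distribution). Not the paper's estimator."""
    losses = np.asarray(losses, dtype=float)
    q = np.asarray(q, dtype=float)          # weights over past steps, sums to 1
    recent = losses[-window:].mean()        # loss under the "target" block
    weighted_past = float(q @ losses)       # q-weighted loss over the sample
    return abs(recent - weighted_past)

# Example: a non-stationary loss sequence whose mean shifts halfway through,
# so the past average underestimates the recent regime.
T = 200
rng = np.random.default_rng(0)
losses = np.concatenate([rng.normal(1.0, 0.1, T // 2),
                         rng.normal(2.0, 0.1, T // 2)])
q = np.full(T, 1.0 / T)                     # uniform weights for illustration
disc = empirical_discrepancy(losses, q, window=20)
```

Under this setup the proxy is large precisely because the process is non-stationary; for an i.i.d. loss sequence it would be close to zero, which mirrors the role discrepancy plays in the paper's bounds.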


Related research

09/17/2022 · DynaConF: Dynamic Forecasting of Non-Stationary Time-Series
  Deep learning models have shown impressive results in a variety of time ...

06/03/2011 · Rademacher complexity of stationary sequences
  We show how to control the generalization error of time series models wh...

03/29/2022 · Split Conformal Prediction for Dependent Data
  Split conformal prediction is a popular tool to obtain predictive interv...

05/19/2012 · New Analysis and Algorithm for Learning with Drifting Distributions
  We present a new analysis of the problem of learning with drifting distr...

05/25/2023 · SAMoSSA: Multivariate Singular Spectrum Analysis with Stochastic Autoregressive Noise
  The well-established practice of time series analysis involves estimatin...

06/28/2022 · Optimal Estimation of Generic Dynamics by Path-Dependent Neural Jump ODEs
  This paper studies the problem of forecasting general stochastic process...

02/23/2022 · Learning Fast and Slow for Online Time Series Forecasting
  The fast adaptation capability of deep neural networks in non-stationary...
