Reservoir kernels and Volterra series

12/30/2022
by Lukas Gonon, et al.

A universal kernel is constructed whose sections approximate any causal and time-invariant filter in the fading memory category with inputs and outputs in a finite-dimensional Euclidean space. This kernel is built using the reservoir functional associated with a state-space representation of the Volterra series expansion available for any analytic fading memory filter; it is hence called the Volterra reservoir kernel. Even though the state-space representation and the corresponding reservoir feature map are defined on an infinite-dimensional tensor algebra space, the kernel map is characterized by explicit recursions that are readily computable for specific data sets when employed in estimation problems using the representer theorem. We showcase the performance of the Volterra reservoir kernel in a popular data science application: bitcoin price prediction.
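To illustrate how a recursively computable sequence kernel is used for estimation via the representer theorem, here is a minimal sketch. The kernel recursion below (`seq_kernel` with decay parameter `lam`) is a hypothetical stand-in, not the paper's Volterra reservoir kernel recursion; the data, targets, and ridge parameter are likewise illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def seq_kernel(u, v, lam=0.8):
    # Hypothetical recursion, NOT the paper's exact formula:
    # K_t = 1 + lam^2 * (1 + <u_t, v_t>) * K_{t-1},  K_0 = 1.
    # It only illustrates that a sequence kernel can be evaluated
    # by a cheap time recursion over paired input samples.
    K = 1.0
    for ut, vt in zip(u, v):
        K = 1.0 + lam ** 2 * (1.0 + float(np.dot(ut, vt))) * K
    return K

# Toy data: n short input sequences of length T in R^d, scalar targets.
n, T, d = 8, 20, 3
seqs = [0.1 * rng.standard_normal((T, d)) for _ in range(n)]
y = rng.standard_normal(n)

# Gram matrix of the kernel on the training sequences.
G = np.array([[seq_kernel(si, sj) for sj in seqs] for si in seqs])

# Representer theorem: the ridge-regression solution lies in the span
# of the kernel sections, f(.) = sum_i alpha_i K(seqs[i], .).
ridge = 1e-3
alpha = np.linalg.solve(G + ridge * np.eye(n), y)

# Evaluate the estimated filter on a new input sequence.
x_new = 0.1 * rng.standard_normal((T, d))
f_new = float(alpha @ np.array([seq_kernel(si, x_new) for si in seqs]))
print(f_new)
```

The point of the sketch is the workflow: once the kernel is computable by a recursion, estimation reduces to a finite linear system in the Gram matrix, regardless of the (possibly infinite-dimensional) feature space behind the kernel.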


Related research:

- 02/16/2019, Differentiable reservoir computing: Much effort has been devoted in the last two decades to characterize the...
- 07/15/2019, Dynamical Systems as Temporal Feature Spaces: Parameterized state space models in the form of recurrent networks are o...
- 08/21/2023, Simple Cycle Reservoirs are Universal: Reservoir computation models form a subclass of recurrent neural network...
- 09/17/2020, Discrete-time signatures and randomness in reservoir computing: A new explanation of geometric nature of the reservoir computing phenome...
- 07/07/2018, Reservoir Computing Universality With Stochastic Inputs: The universal approximation properties with respect to L^p-type criter...
- 04/02/2023, Infinite-dimensional reservoir computing: Reservoir computing approximation and generalization bounds are proved f...
