Reservoir kernels and Volterra series

by Lukas Gonon, et al.

A universal kernel is constructed whose sections approximate any causal and time-invariant filter in the fading memory category with inputs and outputs in a finite-dimensional Euclidean space. This kernel is built using the reservoir functional associated with a state-space representation of the Volterra series expansion available for any analytic fading memory filter, and it is hence called the Volterra reservoir kernel. Even though the state-space representation and the corresponding reservoir feature map are defined on an infinite-dimensional tensor algebra space, the kernel map is characterized by explicit recursions that are readily computable for specific data sets when employed in estimation problems using the representer theorem. We showcase the performance of the Volterra reservoir kernel in a popular data science application: bitcoin price prediction.
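To illustrate how a recursively defined sequence kernel can be combined with the representer theorem in an estimation problem, the sketch below implements kernel ridge regression over sequences using a simple recursion of the form k_t = 1 + λ⟨x_t, y_t⟩ k_{t−1}. This recursion, the parameter names, and the toy data are illustrative stand-ins only, not the paper's exact Volterra reservoir kernel recursion.

```python
import numpy as np

def recursive_seq_kernel(x, y, lam=0.5):
    """Kernel between two sequences x, y of shape (T, d), computed by the
    illustrative recursion k_t = 1 + lam * <x_t, y_t> * k_{t-1}.
    (A simplified stand-in for the Volterra reservoir kernel recursion.)"""
    k = 1.0
    for xt, yt in zip(x, y):
        k = 1.0 + lam * float(np.dot(xt, yt)) * k
    return k

def kernel_ridge_fit(seqs, targets, reg=1e-3, lam=0.5):
    """By the representer theorem, the estimator is a linear combination of
    kernel sections at the training sequences; solve (K + reg*I) alpha = y."""
    n = len(seqs)
    K = np.array([[recursive_seq_kernel(seqs[i], seqs[j], lam)
                   for j in range(n)] for i in range(n)])
    alpha = np.linalg.solve(K + reg * np.eye(n), targets)
    return alpha, K

def kernel_ridge_predict(seqs_train, alpha, seq_new, lam=0.5):
    """Evaluate the fitted kernel expansion at a new input sequence."""
    k_vec = np.array([recursive_seq_kernel(s, seq_new, lam)
                      for s in seqs_train])
    return float(k_vec @ alpha)

# Toy usage on random scalar sequences (illustrative data, not the
# bitcoin experiment from the paper).
rng = np.random.default_rng(0)
seqs = [rng.standard_normal((5, 1)) for _ in range(20)]
targets = np.array([s[-1, 0] for s in seqs])
alpha, K = kernel_ridge_fit(seqs, targets)
pred = kernel_ridge_predict(seqs, alpha, seqs[0])
```

The recursion makes each kernel evaluation O(T) in the sequence length, which is what makes representer-theorem estimators with such kernels computationally practical.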


Differentiable reservoir computing

Much effort has been devoted in the last two decades to characterize the...

Dynamical Systems as Temporal Feature Spaces

Parameterized state space models in the form of recurrent networks are o...

Simple Cycle Reservoirs are Universal

Reservoir computation models form a subclass of recurrent neural network...

Discrete-time signatures and randomness in reservoir computing

A new explanation of geometric nature of the reservoir computing phenome...

Reservoir Computing Universality With Stochastic Inputs

The universal approximation properties with respect to L ^p -type criter...

Infinite-dimensional reservoir computing

Reservoir computing approximation and generalization bounds are proved f...