Universality and approximation bounds for echo state networks with random weights

06/12/2022
by   Zhen Li, et al.

We study the uniform approximation of echo state networks with randomly generated internal weights. These models, in which only the readout weights are optimized during training, have enjoyed empirical success in learning dynamical systems. We address the representational capacity of these models by showing that they are universal under weak conditions. Our main result gives a sufficient condition on the activation function and a sampling procedure for the internal weights so that echo state networks can approximate any continuous causal time-invariant operator with high probability. In particular, for ReLU activation, we quantify the approximation error of echo state networks for sufficiently regular operators.
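The setup described in the abstract can be illustrated with a minimal sketch: internal (reservoir) weights are drawn at random and left untouched, and only a linear readout is fit by least squares. The specific choices below (tanh activation, spectral-radius scaling, ridge regularization, a sine next-step-prediction task) are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 100, 1  # assumed reservoir and input sizes

# Random internal weights -- generated once, never trained.
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius below 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))

def run_reservoir(u):
    """Drive the reservoir with input sequence u; return the state matrix."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x)
    return np.array(states)

# Toy task: predict the next value of a sine wave.
t = np.arange(600)
u = np.sin(0.1 * t)
X = run_reservoir(u[:-1])  # states x_1, ..., x_{T-1}
y = u[1:]                  # one-step-ahead targets

# Only the readout is optimized (Tikhonov-regularized least squares).
washout, lam = 50, 1e-6
Xw, yw = X[washout:], y[washout:]
w_out = np.linalg.solve(Xw.T @ Xw + lam * np.eye(n_res), Xw.T @ yw)

pred = X @ w_out
err = np.sqrt(np.mean((pred[washout:] - y[washout:]) ** 2))
```

Scaling the spectral radius below 1 is a common heuristic for the echo state property (the reservoir asymptotically forgets its initial condition), which is what makes the random-feature view of the state sequence well defined.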


Related research

02/14/2020 · Approximation Bounds for Random Neural Networks and Reservoir Systems
This work studies approximation based on single-hidden-layer feedforward...

05/14/2020 · Echo State Networks trained by Tikhonov least squares are L2(μ) approximators of ergodic dynamical systems
Echo State Networks (ESNs) are a class of single-layer recurrent neural ...

04/04/2023 · Measure theoretic results for approximation by neural networks with limited weights
In this paper, we study approximation properties of single hidden layer ...

05/20/2021 · Neural networks with superexpressive activations and integer weights
An example of an activation function σ is given such that networks with ...

07/12/2020 · Universal Approximation Power of Deep Neural Networks via Nonlinear Control Theory
In this paper, we explain the universal approximation capabilities of de...

10/15/2019 · Neural tangent kernels, transportation mappings, and universal approximation
This paper establishes rates of universal approximation for the shallow ...

05/17/2022 · Sharp asymptotics on the compression of two-layer neural networks
In this paper, we study the compression of a target two-layer neural net...
