Readouts for Echo-state Networks Built using Locally Regularized Orthogonal Forward Regression

by Ján Dolinský, et al.

An echo state network (ESN) can be viewed as a temporal, non-orthogonal expansion with pseudo-random parameters. Such expansions naturally give rise to regressors of varying relevance to the teacher output. We illustrate that often only a subset of the generated echo-regressors effectively explains the variance of the teacher output, and that local regularization alone cannot provide in-depth information about the importance of the generated regressors. We therefore determine this importance by jointly computing the individual variance contributions and the Bayesian relevance of each regressor using the locally regularized orthogonal forward regression (LROFR) algorithm. This information can be exploited in a variety of ways for an in-depth analysis of an ESN's structure and its state-space parameters in relation to the unknown dynamics of the underlying problem. We present a locally regularized linear readout built using LROFR. The readout may have a different dimensionality than the ESN model itself; besides improving the robustness and accuracy of the ESN, it relates the echo-regressors to different features of the training data and may indicate what type of additional readout is suitable for the task at hand. Moreover, because the flexibility of a linear readout is limited and may be insufficient for certain tasks, we also present a radial basis function (RBF) readout built using LROFR. It is a flexible and parsimonious readout with excellent generalization abilities, and a viable alternative to readouts based on a feed-forward neural network (FFNN) or on an RBF network built with the relevance vector machine (RVM).
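To make the selection idea concrete, the following is a minimal sketch of the error-reduction-ratio (ERR) criterion that underlies orthogonal forward regression: candidate regressors (here, a random matrix standing in for collected ESN states) are greedily orthogonalized and ranked by the fraction of the teacher output's variance each one explains. This omits the local Bayesian regularization that distinguishes LROFR; all names and the toy data are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def forward_regression(X, y, n_select):
    """Greedy orthogonal forward regression: rank columns of X by their
    error-reduction ratio (ERR) with respect to the teacher output y."""
    n, m = X.shape
    selected, errs = [], []
    Q = np.empty((n, 0))          # orthonormal basis of regressors chosen so far
    yy = y @ y                    # total output energy (for the ERR denominator)
    remaining = list(range(m))
    for _ in range(n_select):
        best_err, best_j, best_q = -1.0, None, None
        for j in remaining:
            q = X[:, j].copy()
            if Q.shape[1]:        # Gram-Schmidt: remove span of chosen regressors
                q -= Q @ (Q.T @ q)
            qq = q @ q
            if qq < 1e-12:        # candidate is (near-)dependent on the basis
                continue
            err = (q @ y) ** 2 / (qq * yy)   # share of y's variance explained
            if err > best_err:
                best_err, best_j, best_q = err, j, q / np.sqrt(qq)
        if best_j is None:
            break
        selected.append(best_j)
        errs.append(best_err)
        Q = np.column_stack([Q, best_q])
        remaining.remove(best_j)
    return selected, errs

# Toy example: only columns 3 and 7 actually drive the teacher output.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))   # stand-in for a matrix of echo-regressors
y = 2.0 * X[:, 3] - 1.5 * X[:, 7] + 0.1 * rng.standard_normal(200)
sel, errs = forward_regression(X, y, 5)
print(sel, [round(e, 3) for e in errs])
```

On such data the first two selected indices recover the two informative columns, and their ERR values sum close to 1, illustrating the paper's point that only a few echo-regressors may carry most of the explainable variance; LROFR additionally attaches a Bayesian relevance (a per-regressor regularizer) to each candidate during this sweep.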



