Reservoir Computers Modal Decomposition and Optimization
The topology of the network associated with a reservoir computer is often chosen randomly, in both its connectivity and its weights. Optimization is rarely attempted, as the parameter space is typically too large. Here we investigate this problem for a class of reservoir computers for which we obtain a decomposition of the reservoir dynamics into modes that can be computed independently of one another. Each mode depends on an eigenvalue of the network adjacency matrix. We then take a parametric approach in which the eigenvalues are treated as parameters that can be appropriately designed and optimized. In addition, we introduce the application of a time shift to each individual mode. We show that manipulations of the individual modes, either in terms of the eigenvalues or the time shifts, can lead to dramatic reductions in the training error.
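The sketch below illustrates the general idea in a minimal setting. It assumes, as an illustration only, a linear discrete-time reservoir r[t+1] = A r[t] + w_in u[t] with a diagonalizable adjacency matrix A, so that each mode evolves independently according to one eigenvalue; the eigenvalue design, the per-mode time shifts, the ridge-regression readout, and the toy recall task are all hypothetical choices, not the specific model of the paper.

```python
# Hypothetical sketch: modal decomposition of a linear reservoir, with the
# eigenvalues treated as design parameters and a time shift applied per mode.
import numpy as np

def run_modes(eigvals, b, u):
    """Evolve each mode independently: m_k[t+1] = lambda_k * m_k[t] + b_k * u[t]."""
    T, K = len(u), len(eigvals)
    m = np.zeros((T, K), dtype=complex)
    for t in range(T - 1):
        m[t + 1] = eigvals * m[t] + b * u[t]
    return m

def shift_modes(m, shifts):
    """Apply an individual (integer) time shift to each mode column."""
    return np.column_stack([np.roll(m[:, k], s) for k, s in enumerate(shifts)])

def train_readout(features, target, ridge=1e-6):
    """Ridge-regression readout on the real and imaginary parts of the modes."""
    X = np.hstack([features.real, features.imag])
    W = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ target)
    return W, target - X @ W  # readout weights and training residual

# Toy example: design the eigenvalues directly (parametric approach) instead of
# drawing a random adjacency matrix, then evaluate the training error.
rng = np.random.default_rng(0)
u = rng.standard_normal(2000)
target = np.roll(u, 5)                                   # recall the input 5 steps back
eigvals = 0.9 * np.exp(1j * np.linspace(0, np.pi, 20))   # designed (stable) eigenvalues
b = np.ones(20)                                          # input coupling per mode
modes = shift_modes(run_modes(eigvals, b, u), shifts=np.zeros(20, dtype=int))
_, resid = train_readout(modes, target)
print("training RMSE:", np.sqrt(np.mean(resid**2)))
```

In this setting, optimizing the reservoir reduces to searching over the eigenvalues and the per-mode shifts, a far smaller parameter space than the full set of network weights.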