P-CRITICAL: A Reservoir Autoregulation Plasticity Rule for Neuromorphic Hardware
Backpropagation algorithms on recurrent artificial neural networks require unrolling accumulated states over time. These states must be kept in memory for a task-dependent and potentially unbounded period of time. This paper uses the reservoir computing paradigm, where an untrained recurrent neural network layer serves as a preprocessing stage for learning from temporal and limited data. Such reservoirs typically require either extensive fine-tuning or neuroplasticity driven by unsupervised learning rules. We propose a new local plasticity rule named P-CRITICAL, designed for automatic reservoir tuning, that translates well to Intel's Loihi research chip, a recent neuromorphic processor. We evaluate our approach on well-known datasets from the machine learning community using a spiking neuronal architecture and observe improved performance on tasks from various modalities without the need for parameter tuning. Such algorithms could be key to end-to-end, energy-efficient, neuromorphic-based machine learning on edge devices.
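To make the reservoir computing paradigm described above concrete, the following is a minimal, illustrative sketch of a non-spiking echo-state reservoir with a trained linear readout. It is not the paper's P-CRITICAL rule or its spiking/Loihi implementation; the network sizes, spectral-radius scaling, and ridge parameter are arbitrary assumptions chosen for the example.

```python
# Minimal echo-state reservoir sketch (illustrative only; not the paper's
# P-CRITICAL rule or its spiking/Loihi implementation).
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 200                                 # input and reservoir sizes (arbitrary)
W_in = rng.normal(scale=0.5, size=(n_res, n_in))     # fixed, untrained input weights
W = rng.normal(size=(n_res, n_res))                  # fixed, untrained recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # hand-tuned spectral radius

def run_reservoir(inputs):
    """Drive the untrained recurrent layer and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))  # recurrent state update
        states.append(x.copy())
    return np.array(states)

# Toy temporal task: predict the next value of a sine wave.
t = np.linspace(0, 8 * np.pi, 400)
u, y = np.sin(t[:-1]), np.sin(t[1:])

X = run_reservoir(u)
# Only the linear readout is trained (ridge regression); the reservoir stays fixed.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
print("readout MSE:", np.mean((X @ W_out - y) ** 2))
```

Note that the manual spectral-radius scaling in this sketch is exactly the kind of hand-tuning that unsupervised plasticity rules such as P-CRITICAL aim to replace with automatic, local adaptation of the reservoir.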