Input correlations impede suppression of chaos and learning in balanced rate networks

01/24/2022
by Rainer Engelken, et al.

Neural circuits exhibit complex activity patterns, both spontaneously and evoked by external stimuli. Information encoding and learning in neural circuits depend on how well time-varying stimuli can control spontaneous network activity. We show that in firing-rate networks in the balanced state, external control of recurrent dynamics, i.e., the suppression of internally-generated chaotic variability, strongly depends on correlations in the input. A unique feature of balanced networks is that, because common external input is dynamically canceled by recurrent feedback, it is far easier to suppress chaos with independent inputs into each neuron than through common input. To study this phenomenon we develop a non-stationary dynamic mean-field theory that determines how the activity statistics and largest Lyapunov exponent depend on frequency and amplitude of the input, recurrent coupling strength, and network size, for both common and independent input. We also show that uncorrelated inputs facilitate learning in balanced networks.
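As a concrete illustration of the kind of measurement described above, the following is a minimal sketch (not the authors' code) that estimates the largest Lyapunov exponent of a sinusoidally driven random rate network using the standard tangent-vector (Benettin) method, switching between a common drive and independent per-neuron drive phases. The model, default parameters, and the function name `largest_lyapunov` are illustrative assumptions: the coupling is a plain Gaussian random matrix rather than the balanced excitatory-inhibitory connectivity analyzed in the paper, so the sketch shows how the exponent can be measured rather than reproducing the reported effect.

```python
import numpy as np

def largest_lyapunov(N=200, g=2.0, amp=1.0, freq=0.5, common=True,
                     dt=0.01, T=200.0, seed=0):
    """Estimate the largest Lyapunov exponent of a driven random rate network.

    Toy model (an illustrative stand-in, not the paper's balanced E-I network):
        dh/dt = -h + J @ tanh(h) + I(t),   with J_ij ~ N(0, g^2 / N),
    where I(t) is a sinusoid that is identical for all neurons (common=True)
    or carries an independent random phase per neuron (common=False).
    """
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    phases = np.zeros(N) if common else rng.uniform(0.0, 2.0 * np.pi, N)

    h = rng.normal(0.0, 1.0, N)      # network state
    v = rng.normal(0.0, 1.0, N)      # tangent (perturbation) vector
    v /= np.linalg.norm(v)

    steps = int(T / dt)
    log_growth = 0.0
    for k in range(steps):
        drive = amp * np.sin(2.0 * np.pi * freq * k * dt + phases)
        r = np.tanh(h)
        # Euler step of the rate dynamics.
        h = h + dt * (-h + J @ r + drive)
        # Linearized dynamics of the perturbation: dv/dt = (-1 + J diag(1 - r^2)) v.
        v = v + dt * (-v + J @ ((1.0 - r**2) * v))
        # Benettin's method: renormalize and accumulate the log growth.
        norm = np.linalg.norm(v)
        log_growth += np.log(norm)
        v /= norm

    return log_growth / (steps * dt)

if __name__ == "__main__":
    for common in (True, False):
        lam = largest_lyapunov(common=common)
        print(f"common input = {common}: lambda_max ~ {lam:.3f}")
```

Under this toy model, a negative estimate indicates that the drive has entrained the network and suppressed chaos; sweeping amp, freq, g, N, and the common flag gives a rough sense of how the estimate depends on the input, though the common-versus-independent asymmetry highlighted in the abstract specifically relies on the balanced connectivity it describes.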

Related research

12/26/2017  Chaos-guided Input Structuring for Improved Learning in Recurrent Neural Networks
Anatomical studies demonstrate that the brain reformats input information to...

08/27/2015  Continuous parameter working memory in a balanced chaotic neural network
It has been proposed that neural noise in the cortex arises from chaotic...

10/31/2012  Mean Field Theory of Dynamical Systems Driven by External Signals
Dynamical systems driven by strong external signals are ubiquitous in n...

12/29/2018  Training dynamically balanced excitatory-inhibitory networks
The construction of biologically plausible models of neural circuits is ...

05/01/2022  Dynamic modeling of spike count data with Conway-Maxwell Poisson variability
In many areas of the brain, neural spiking activity covaries with featur...

06/22/2019  Repeated sequential learning increases memory capacity via effective decorrelation in a recurrent neural network
Memories in neural systems are shaped through the interplay of neural and...

06/25/2020  Predictive coding in balanced neural networks with noise, chaos and delays
Biological neural networks face a formidable task: performing reliable c...
