aSTDP: A More Biologically Plausible Learning

05/22/2022
by Shiyuan Li, et al.

Spike-timing dependent plasticity (STDP) has been shown to play an important role in learning in biological neural networks. Artificial neural networks, by contrast, learn in a different way, for example through Back-Propagation or Contrastive Hebbian Learning. In this work we introduce approximate STDP (aSTDP), a neural network learning framework that is closer to the biological learning process. It uses only STDP rules for both supervised and unsupervised learning: every neuron learns patterns in a distributed manner and requires no global loss or other supervised signal. We also numerically approximate the derivative of each neuron in order to make better use of STDP learning, and use these derivatives to set targets for neurons, which accelerates training and testing. The framework can make predictions or generate patterns with a single model and no additional configuration. Finally, we verify the framework on the MNIST dataset for classification and generation tasks.
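For readers unfamiliar with STDP, the sketch below shows a standard pair-based STDP weight update with exponential timing windows. It is only an illustration of the general rule the abstract refers to; the amplitudes, time constant, and function names are assumptions and are not taken from the paper.

```python
import numpy as np

# Illustrative pair-based STDP rule (parameters are assumptions, not from the paper).
A_PLUS = 0.01    # potentiation amplitude
A_MINUS = 0.012  # depression amplitude
TAU = 20.0       # timing-window time constant (ms)

def stdp_delta_w(t_pre, t_post):
    """Weight change for a single pre/post spike pair.

    Pre-before-post (dt > 0) strengthens the synapse;
    post-before-pre (dt < 0) weakens it.
    """
    dt = t_post - t_pre
    if dt > 0:
        return A_PLUS * np.exp(-dt / TAU)
    return -A_MINUS * np.exp(dt / TAU)

# Example: a pre-synaptic spike at 10 ms followed by a post-synaptic spike at 15 ms
w = 0.5
w += stdp_delta_w(t_pre=10.0, t_post=15.0)
print(w)
```

The paper's aSTDP framework builds on rules of this kind, combining them with numerically approximated per-neuron derivatives to set training targets, rather than relying on a global loss.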
