A Dual Process Model for Optimizing Cross Entropy in Neural Networks

04/27/2021
by Stefan Jaeger, et al.

Minimizing cross-entropy is a widely used method for training artificial neural networks. Many training procedures based on backpropagation use cross-entropy directly as their loss function. This theoretical essay instead investigates a dual-process model in which one process minimizes the Kullback-Leibler divergence while its dual counterpart minimizes the Shannon entropy. This split reflects the standard decomposition of cross-entropy, H(p, q) = H(p) + D_KL(p || q), into a Shannon-entropy term and a Kullback-Leibler term. Postulating that learning consists of two dual processes that complement each other, the model defines an equilibrium state for both processes in which the loss function attains its minimum. An advantage of the proposed model is that it allows the optimal learning rate and momentum weight for backpropagation weight updates to be derived. Furthermore, the model introduces the golden ratio and complex numbers as important new concepts in machine learning.
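
As a minimal numerical sketch of this decomposition (an illustration, not code from the paper), the following Python snippet checks that cross-entropy equals Shannon entropy plus Kullback-Leibler divergence; the distributions p and q are arbitrary example choices:

    import numpy as np

    # Illustrative distributions (arbitrary choices, not from the paper).
    p = np.array([0.7, 0.2, 0.1])  # target distribution
    q = np.array([0.5, 0.3, 0.2])  # model's predicted distribution

    # Shannon entropy of p: H(p) = -sum_i p_i * log(p_i)
    shannon_entropy = -np.sum(p * np.log(p))

    # Kullback-Leibler divergence: D_KL(p || q) = sum_i p_i * log(p_i / q_i)
    kl_divergence = np.sum(p * np.log(p / q))

    # Cross-entropy: H(p, q) = -sum_i p_i * log(q_i)
    cross_entropy = -np.sum(p * np.log(q))

    # The identity H(p, q) = H(p) + D_KL(p || q) holds exactly.
    assert np.isclose(cross_entropy, shannon_entropy + kl_divergence)
    print(f"H(p)       = {shannon_entropy:.6f}")
    print(f"D_KL(p||q) = {kl_divergence:.6f}")
    print(f"H(p, q)    = {cross_entropy:.6f}")

Because the two terms can be driven down separately, a process pair of the kind the essay describes, one acting on D_KL(p || q) and one on H(p), jointly addresses the same quantity that standard cross-entropy training minimizes directly.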
