Fractional moment-preserving initialization schemes for training fully-connected neural networks

05/25/2020
by   Mert Gurbuzbalaban, et al.

A common approach to initialization in deep neural networks is to sample the network weights from a Gaussian distribution in order to preserve the variance of preactivations. On the other hand, recent research shows that for many deep neural networks the training process can lead to non-Gaussianity and heavy tails in the distribution of the network weights, so that the weights no longer have a finite variance but only a finite (non-integer) fractional moment of order s with s < 2. Motivated by this fact, we develop initialization schemes for fully connected feed-forward networks that can provably preserve any given moment of order s ∈ (0,2) for ReLU, Leaky ReLU, Randomized Leaky ReLU, and linear activations. The proposed strategy incurs no extra cost during training. We also show through numerical experiments that our initialization can improve training and test performance.
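
As a rough illustration of what preserving a fractional moment means, the sketch below (Python/NumPy) rescales Gaussian layer weights so that the empirical s-th absolute moment of the post-activation outputs matches that of the inputs. The Monte Carlo calibration, the choice of standard-normal probe inputs, and the function names are illustrative assumptions, not the paper's derived closed-form scaling.

```python
import numpy as np

def fractional_moment(x, s):
    """Empirical s-th absolute moment E[|x|^s] of a sample."""
    return np.mean(np.abs(x) ** s)

def init_layer(fan_in, fan_out, s, activation=lambda z: np.maximum(z, 0.0),
               n_probe=10_000, rng=None):
    """
    Illustrative sketch (not the paper's formula): draw Gaussian weights and
    rescale them so that the s-th absolute moment of the post-activation
    outputs matches that of standard-normal inputs, estimated by Monte Carlo.
    Works for positively homogeneous activations (ReLU, Leaky ReLU, linear),
    since scaling W by c scales activation(Wx) by c as well.
    """
    rng = np.random.default_rng() if rng is None else rng
    W = rng.standard_normal((fan_out, fan_in)) / np.sqrt(fan_in)

    # Probe inputs and the target moment we want the layer output to match.
    x = rng.standard_normal((fan_in, n_probe))
    target = fractional_moment(x, s)

    # Current s-th moment of the post-activation output.
    h = activation(W @ x)
    current = fractional_moment(h, s)

    # Rescale: E[|c * h|^s] = c^s * E[|h|^s], so choose c = (target/current)^(1/s).
    W *= (target / current) ** (1.0 / s)
    return W

if __name__ == "__main__":
    # Check that the s-th moment stays roughly constant through a deep stack.
    s, width, depth = 1.5, 256, 20
    rng = np.random.default_rng(0)
    x = rng.standard_normal((width, 5_000))
    for _ in range(depth):
        W = init_layer(width, width, s, rng=rng)
        x = np.maximum(W @ x, 0.0)
        print(f"E[|h|^{s}] = {fractional_moment(x, s):.3f}")
```

With a variance-preserving (He-style) initialization, the same stack would instead be calibrated to the second moment; the point of the sketch is only that the calibration target can be any moment of order s ∈ (0,2).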
