A Study on Binary Neural Networks Initialization

09/18/2019
by   Eyyüb Sari, et al.

Initialization plays a crucial role in training neural models. Binary Neural Networks (BNNs) are the most extreme form of quantization and often suffer from a drop in accuracy. Most work on neural network initialization has been carried out in the full-precision setting, where the variance of the random initialization decreases with the number of parameters per layer. We show that, contrary to common belief, such popular initialization schemes are meaningless for BNNs. We analyze binary networks analytically and propose initializing binary weights with the same variance across different layers. We perform experiments showing the accuracy gain obtained with this straightforward heuristic.
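The sketch below is a minimal illustration (not the authors' code) of the contrast the abstract describes: a standard width-dependent scheme such as Glorot initialization versus the proposed heuristic of using the same variance for every layer. It assumes the usual BNN setup in which real-valued latent weights are initialized and then binarized with a sign function; the layer shapes and the constant standard deviation are illustrative choices, not values from the paper.

import numpy as np

rng = np.random.default_rng(0)
layer_shapes = [(784, 512), (512, 512), (512, 10)]  # example MLP layer shapes

def glorot_init(fan_in, fan_out):
    # Standard full-precision scheme: variance shrinks as layers get wider.
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))

def constant_variance_init(fan_in, fan_out, std=0.01):
    # Heuristic from the abstract: identical variance in every layer,
    # independent of fan-in/fan-out (the std value here is an assumption).
    return rng.normal(0.0, std, size=(fan_in, fan_out))

def binarize(latent_w):
    # BNN forward pass uses only the sign (+1/-1) of the latent weights,
    # which is why width-dependent variance scaling has no effect on them.
    return np.sign(latent_w)

latent_weights = [constant_variance_init(m, n) for m, n in layer_shapes]
binary_weights = [binarize(w) for w in latent_weights]

Because binarization discards the magnitude of each latent weight, scaling the initial variance by layer width, as Glorot-style schemes do, leaves the binarized weights statistically unchanged; only the relative ordering of latent weights around zero matters, which motivates a width-independent choice.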
