Skip Connections Eliminate Singularities

01/31/2017
by A. Emin Orhan, et al.

Skip connections made the training of very deep networks possible and have become an indispensable component of a variety of neural architectures. A completely satisfactory explanation for their success remains elusive. Here, we present a novel explanation for the benefits of skip connections in training very deep networks. The difficulty of training deep networks is partly due to the singularities caused by the non-identifiability of the model. Two such singularities have been identified in previous work: (i) overlap singularities caused by the permutation symmetry of nodes in a given layer and (ii) elimination singularities corresponding to the elimination, i.e., consistent deactivation, of nodes. These singularities give rise to degenerate manifolds in the loss landscape that were previously shown to slow down learning. We argue that skip connections eliminate these singularities by breaking the permutation symmetry of nodes and by reducing the possibility of node elimination. Moreover, for typical initializations, skip connections move the network away from the "ghosts" of these singularities and sculpt the landscape around them to alleviate the learning slow-down. These hypotheses are supported by evidence from simplified models, as well as from experiments with fully-connected deep networks trained on CIFAR-10 and CIFAR-100.
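To make the mechanism concrete, here is a minimal NumPy sketch (an illustration, not the authors' code; the function names, shapes, and ReLU choice are our own assumptions) contrasting a plain fully-connected layer with one that adds an identity skip connection:

```python
import numpy as np

def plain_layer(x, W, b):
    """Plain fully-connected ReLU layer."""
    # Hidden units are interchangeable: permuting the rows of W and b
    # (with the matching permutation of the next layer's weights) leaves
    # the network function unchanged -> overlap singularities.
    # If a unit's weights and bias shrink to zero, the unit is
    # effectively removed from the model -> elimination singularities.
    return np.maximum(W @ x + b, 0.0)

def skip_layer(x, W, b):
    """Fully-connected ReLU layer with an identity skip connection."""
    # Adding the input back ties each hidden unit to a specific input
    # coordinate, breaking the permutation symmetry between units, and
    # keeps a unit's output nonzero even as W and b approach zero, so
    # elimination becomes unlikely. The identity term requires the layer
    # width to match the input dimension.
    return np.maximum(W @ x + b, 0.0) + x

# Tiny usage example
rng = np.random.default_rng(0)
x = rng.standard_normal(4)
W = rng.standard_normal((4, 4))
b = rng.standard_normal(4)
print(plain_layer(x, W, b))
print(skip_layer(x, W, b))
```

In this sketch the skip connection is a plain identity shortcut; the paper's argument is about why such shortcuts remove the degenerate directions in the loss landscape rather than about any particular implementation.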
