A Riemannian Mean Field Formulation for Two-layer Neural Networks with Batch Normalization

10/17/2021
by   Chao Ma, et al.

The training dynamics of two-layer neural networks with batch normalization (BN) are studied. The dynamics are rewritten as those of a neural network without BN trained on a Riemannian manifold, which identifies BN's effect as changing the metric on the parameter space. The infinite-width limit of two-layer neural networks with BN is then considered, and a mean-field formulation of the training dynamics is derived. The training dynamics of the mean-field formulation are shown to be a Wasserstein gradient flow on the manifold. Theoretical analyses of the well-posedness and convergence of this Wasserstein gradient flow are provided.
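For orientation, here is a generic sketch of the mean-field picture the abstract refers to, written in standard notation that is illustrative only and not necessarily the paper's exact conventions. A two-layer mean-field network represents the output as an average over a parameter distribution rho on theta = (a, w),

    f_\rho(x) = \int a\, \sigma(w \cdot x)\, \mathrm{d}\rho(a, w),

and, in the infinite-width limit, gradient-descent training of the population risk R(\rho) corresponds to a Wasserstein gradient flow

    \partial_t \rho_t = \nabla \cdot \big( \rho_t\, \nabla \tfrac{\delta R}{\delta \rho}(\rho_t) \big).

The abstract's point is that BN changes this picture only through the metric: the flow becomes a Wasserstein gradient flow built from a Riemannian metric on parameter space rather than the Euclidean one. The precise manifold and metric are defined in the paper itself.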
