Learning Discretized Neural Networks under Ricci Flow

02/07/2023
by Jun Chen, et al.

In this paper, we consider Discretized Neural Networks (DNNs) consisting of low-precision weights and activations, which suffer from either infinite or zero gradients during training, caused by the non-differentiable discrete function. In this case, most training-based methods for DNNs use the standard Straight-Through Estimator (STE) to approximate the gradient w.r.t. discrete values. However, the STE causes gradient mismatch: the approximated gradient carries perturbations relative to the true gradient. Through the lens of duality theory, we show that this mismatch can be viewed as a metric perturbation on a Riemannian manifold. To address this problem, we draw on information geometry to construct a Linearly Nearly Euclidean (LNE) manifold for DNNs as a background geometry for handling perturbations. By introducing a partial differential equation on metrics, the Ricci flow, we prove the dynamical stability and convergence of the LNE metric under L^2-norm perturbations. Moreover, whereas previous perturbation theory yields convergence rates of fractional powers, we show that the metric perturbation under the Ricci flow decays exponentially on the LNE manifold. Experimental results on various datasets demonstrate that our method achieves better and more stable performance for DNNs than other representative training-based methods.
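For context, the standard STE mentioned in the abstract replaces the zero-almost-everywhere derivative of a discretizer with an identity surrogate in the backward pass. Below is a minimal sketch assuming PyTorch; the sign quantizer and the name `SignSTE` are illustrative, not taken from the paper:

```python
import torch

class SignSTE(torch.autograd.Function):
    """Sign discretizer with a Straight-Through Estimator backward pass."""

    @staticmethod
    def forward(ctx, x):
        # Forward: non-differentiable discretization (here, binarization by sign).
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Backward: pass the incoming gradient straight through, as if the
        # discretizer were the identity. (A common variant also zeroes the
        # gradient where |x| > 1.) This approximation is exactly the source
        # of the gradient mismatch the abstract refers to.
        return grad_output

x = torch.randn(4, requires_grad=True)
y = SignSTE.apply(x)
y.sum().backward()
print(x.grad)  # all ones: the identity surrogate gradient, not the true one
```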
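The "partial differential equation on metrics" is the standard Ricci flow. A hedged sketch of the stability claim in our own notation (the symbols g_LNE, h_0, C, and lambda are illustrative, not the paper's):

```latex
% Ricci flow on the metric g(t), started from an L^2-perturbed LNE metric.
\begin{equation}
  \frac{\partial g(t)}{\partial t} = -2\,\operatorname{Ric}\!\big(g(t)\big),
  \qquad g(0) = g_{\mathrm{LNE}} + h_0 .
\end{equation}
% The claimed exponential decay of the metric perturbation then reads
\begin{equation}
  \big\| g(t) - g_{\mathrm{LNE}} \big\|_{L^2}
  \le C\, e^{-\lambda t}\, \| h_0 \|_{L^2},
  \qquad t \ge 0,\ \lambda > 0 ,
\end{equation}
% in contrast to the fractional-power rates of earlier perturbation theory.
```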
