Universal characteristics of deep neural network loss surfaces from random matrix theory

05/17/2022
by   Nicholas P. Baskerville, et al.

This paper considers several aspects of random matrix universality in deep neural networks. Motivated by recent experimental work, we use universal properties of random matrices related to local statistics to derive practical implications for deep neural networks, based on a realistic model of their Hessians. In particular, we derive universal aspects of outliers in the spectra of deep neural networks and demonstrate the important role of random matrix local laws in popular pre-conditioned gradient descent algorithms. We also present insights into deep neural network loss surfaces from quite general arguments based on tools from statistical physics and random matrix theory.
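The local spectral statistics the abstract appeals to can be illustrated numerically. The sketch below (not from the paper; a standard demonstration, assuming NumPy) samples a matrix from the Gaussian Orthogonal Ensemble and computes the adjacent-gap ratio statistic, whose mean for GOE matrices sits near the universal value ≈ 0.536 regardless of the entry distribution's details:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Sample a GOE matrix: symmetrize an i.i.d. Gaussian matrix.
a = rng.standard_normal((n, n))
h = (a + a.T) / 2.0

# Eigenvalues of a real symmetric matrix, in ascending order.
eigs = np.linalg.eigvalsh(h)

# Adjacent-gap ratios r_i = min(s_i, s_{i+1}) / max(s_i, s_{i+1}),
# a unfolding-free probe of local (bulk) spectral statistics.
s = np.diff(eigs)
r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])

print(f"mean gap ratio: {r.mean():.3f}")
```

Universality means that Hessian-like random matrices from very different models produce the same local statistic, which is the kind of property the paper exploits for deep network Hessians.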
