Statistical Mechanics of Generalization in Kernel Regression

06/23/2020
by   Abdulkadir Canatar, et al.

Generalization beyond a training dataset is a main goal of machine learning. We investigate the generalization error of kernel regression using statistical mechanics, deriving an analytical expression applicable to any kernel. We discuss applications to a kernel with a finite number of spectral modes. Then, focusing on the broad class of rotation invariant kernels, which are relevant to training deep neural networks in the infinite-width limit, we show several phenomena. When data are drawn from a spherically symmetric distribution and the number of input dimensions, D, is large, we find that multiple learning stages exist, one for each scaling of the number of training samples as 𝒪_D(D^K) with K ∈ ℤ^+. The behavior of the learning curve in each stage is governed by an effective noise and an effective regularizer that depend on the tails of the kernel and target function spectra. When the effective regularizer is zero, we identify a first-order phase transition that corresponds to a divergence in the generalization error. Each learning stage can exhibit sample-wise double descent, where the learning curve depends non-monotonically on the sample size. For each stage, there exists an optimal value of the effective regularizer, equal to the effective noise variance, that minimizes the generalization error.
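As a rough illustration of the setting described above (a minimal sketch, not the authors' code), the snippet below runs kernel ridge regression on synthetic data drawn uniformly from a sphere with a noisy target, and reports the test error for several values of the explicit ridge regularizer. The kernel choice (RBF), target function, noise level, and sample sizes are all assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
D, P_train, P_test = 10, 200, 2000   # input dimension, train/test sample sizes (assumed)
noise_std = 0.1                      # standard deviation of label noise (assumed)

def sample_sphere(n, d):
    """Draw n points uniformly from the unit sphere S^{d-1}."""
    x = rng.standard_normal((n, d))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def rbf_kernel(X, Y, bandwidth=1.0):
    """Rotation-invariant (RBF) kernel matrix between rows of X and Y."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * bandwidth**2))

def target(X):
    """A simple smooth target function on the sphere (chosen for illustration)."""
    return np.sin(3 * X[:, 0]) + X[:, 1] ** 2

X_tr, X_te = sample_sphere(P_train, D), sample_sphere(P_test, D)
y_tr = target(X_tr) + noise_std * rng.standard_normal(P_train)
y_te = target(X_te)

K_tr = rbf_kernel(X_tr, X_tr)
K_te = rbf_kernel(X_te, X_tr)

for lam in [0.0, 1e-4, 1e-2, 1e0]:
    # Kernel ridge regression coefficients: alpha = (K + lam * P * I)^{-1} y
    alpha = np.linalg.solve(K_tr + lam * P_train * np.eye(P_train), y_tr)
    test_err = np.mean((K_te @ alpha - y_te) ** 2)   # empirical generalization error
    print(f"lambda = {lam:g}: test MSE = {test_err:.4f}")
```

With label noise present, an intermediate value of the ridge parameter typically gives lower test error than the ridgeless (lambda = 0) fit, in line with the abstract's statement that an optimal effective regularizer, tied to the noise level, minimizes the generalization error.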
