Noisy Interpolation Learning with Shallow Univariate ReLU Networks
We study the asymptotic overfitting behavior of interpolation with minimum-norm (ℓ_2 norm of the weights) two-layer ReLU networks for noisy univariate regression. We show that overfitting is tempered for the L_1 loss, and more generally for any L_p loss with p < 2, but catastrophic for p ≥ 2.
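The setting above can be illustrated with a small simulation: fit an interpolating predictor to noisy 1D samples and measure its test L_1 and L_2 losses against the noiseless target. This is only a sketch of the evaluation setup, not the paper's construction; `np.interp` (connect-the-dots) is used as a hypothetical stand-in for the minimum-norm two-layer ReLU interpolant, which is also piecewise linear in the univariate case.

```python
import numpy as np

# Noisy univariate regression: an interpolating predictor is compared to the
# noiseless regression function under L_1 and L_2 test loss.
# NOTE: np.interp is an illustrative stand-in for the minimum-norm ReLU
# interpolant studied in the paper, not its actual construction.
rng = np.random.default_rng(0)

n_train, n_test, noise_std = 200, 10_000, 0.5
target = lambda x: np.sin(2 * np.pi * x)  # noiseless regression function

x_train = np.sort(rng.uniform(-1.0, 1.0, n_train))
y_train = target(x_train) + rng.normal(0.0, noise_std, n_train)  # noisy labels

# Evaluate on fresh points inside the training range.
x_test = rng.uniform(x_train.min(), x_train.max(), n_test)
y_hat = np.interp(x_test, x_train, y_train)  # fits every training point exactly

resid = np.abs(y_hat - target(x_test))
l1_loss = resid.mean()          # test L_1 loss
l2_loss = (resid ** 2).mean()   # test L_2 (squared) loss
print(f"L1 test loss: {l1_loss:.3f}, L2 test loss: {l2_loss:.3f}")
```

Because the predictor interpolates the noisy labels, its test losses stay bounded away from zero even as the sample size grows; the paper's results characterize when this residual error remains of the order of the noise level (tempered) versus diverging (catastrophic).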