Sharp Lower Bounds on Interpolation by Deep ReLU Neural Networks at Irregularly Spaced Data

02/02/2023
by Jonathan W. Siegel, et al.

We study the interpolation, or memorization, power of deep ReLU neural networks. Specifically, we consider the question of how efficiently, in terms of the number of parameters, deep ReLU networks can interpolate values at N datapoints in the unit ball that are pairwise separated by a distance δ. We show that Ω(N) parameters are required in the regime where δ is exponentially small in N; this is sharp in that regime, since O(N) parameters are always sufficient. This also shows that the bit-extraction technique used to prove lower bounds on the VC dimension cannot be applied to irregularly spaced datapoints.
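To make the O(N) sufficiency direction concrete, below is a minimal sketch in one dimension (the paper itself concerns datapoints in the unit ball of higher dimension, so this is an illustration, not the paper's construction): a one-hidden-layer ReLU network with N − 1 hidden units, hence O(N) parameters, exactly interpolates N points by realizing the piecewise-linear interpolant. The function name relu_interpolant and the example data are hypothetical.

import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def relu_interpolant(x, y):
    # Build f(t) = y[0] + sum_i c_i * relu(t - x_i), a one-hidden-layer
    # ReLU network with N - 1 hidden units that exactly interpolates the
    # N points (x_i, y_i). Assumes x is sorted with distinct entries.
    slopes = np.diff(y) / np.diff(x)                     # slope of each linear piece
    c = np.concatenate(([slopes[0]], np.diff(slopes)))   # slope change at each knot
    knots = x[:-1]                                       # hidden-unit breakpoints
    def f(t):
        t = np.asarray(t, dtype=float)
        return y[0] + relu(t[..., None] - knots) @ c
    return f

# Usage: N = 20 points in [-1, 1], pairwise separated by delta = 0.1
x = np.linspace(-1.0, 0.9, 20)
y = np.random.default_rng(0).standard_normal(20)
f = relu_interpolant(x, y)
assert np.allclose(f(x), y)   # exact interpolation at all N datapoints

The network above uses a parameter count linear in N regardless of how small the separation δ is; the paper's contribution is the matching lower bound showing that, once δ is exponentially small in N, no deep ReLU network with o(N) parameters can interpolate such data, so the linear count cannot be improved in that regime.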

