Expressivity of Shallow and Deep Neural Networks for Polynomial Approximation

03/06/2023
by Itai Shapira, et al.

We analyze the number of neurons a ReLU neural network needs to approximate multivariate monomials. We establish an exponential lower bound on the complexity of any shallow network that approximates the product function (x_1, …, x_d) ↦ ∏_{i=1}^{d} x_i on a general compact domain. Furthermore, we prove that this lower bound does not hold for normalized O(1)-Lipschitz monomials (or, equivalently, when the domain is restricted to the unit cube). These results suggest that shallow ReLU networks suffer from the curse of dimensionality when expressing functions whose Lipschitz parameter scales with the input dimension, and that the expressive power of neural networks lies in their depth rather than in their overall complexity.
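To make the Lipschitz-scaling point concrete, here is a short illustrative computation (our own sketch, not taken from the paper; the domains [0,1]^d and [0,M]^d are assumed for illustration): on the unit cube every partial derivative of the product function is bounded by 1, whereas on a larger cube the bound grows with the dimension.

% Partial derivatives of the product function f(x) = x_1 x_2 ... x_d
% (illustrative sketch; the domains below are our own choices)
\[
  f(x) = \prod_{i=1}^{d} x_i,
  \qquad
  \frac{\partial f}{\partial x_i}(x) = \prod_{j \neq i} x_j .
\]
% On the unit cube each partial derivative is at most 1, so f is O(1)-Lipschitz:
\[
  x \in [0,1]^d \;\Rightarrow\; \left| \frac{\partial f}{\partial x_i}(x) \right| \le 1 .
\]
% On a general compact cube [0,M]^d with M > 1 the same bound becomes M^{d-1},
% i.e. the Lipschitz parameter scales (exponentially) with the input dimension d:
\[
  x \in [0,M]^d \;\Rightarrow\; \left| \frac{\partial f}{\partial x_i}(x) \right| \le M^{\,d-1} .
\]

This is the sense in which the exponential lower bound concerns functions whose Lipschitz parameter grows with the dimension, while the normalized (unit-cube) case escapes it.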
