Deep neural network approximation of analytic functions

04/05/2021
by Aleksandr Beknazaryan et al.

We provide an entropy bound for spaces of neural networks with piecewise linear activation functions, such as the ReLU and the absolute value functions. This bound generalizes the known entropy bound for the space of linear functions on ℝ^d, and it depends on the value at the point (1,1,...,1) of the network obtained by taking the absolute values of all parameters of the original network. By requiring this value, together with the depth, width, and parameters of the networks, to depend at most logarithmically on 1/ε, we ε-approximate functions that are analytic on certain regions of ℂ^d. As a statistical application, we derive an oracle inequality for the expected error of the considered penalized deep neural network estimators.
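The quantity controlling the bound is concrete: replace every weight and bias of the network by its absolute value and evaluate the resulting network at (1, 1, ..., 1). Below is a minimal NumPy sketch of this computation for a fully connected ReLU network; the architecture and all names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def forward(weights, biases, x):
    """Evaluate a fully connected ReLU network with a linear output layer."""
    for W, b in zip(weights[:-1], biases[:-1]):
        x = relu(W @ x + b)
    return weights[-1] @ x + biases[-1]

def abs_network_at_ones(weights, biases):
    """Value at (1, ..., 1) of the network whose parameters are the
    absolute values of the original parameters.  With nonnegative
    parameters and the all-ones input, every preactivation is
    nonnegative, so the ReLUs act as the identity and the result is
    monotone in the magnitude of each parameter."""
    abs_weights = [np.abs(W) for W in weights]
    abs_biases = [np.abs(b) for b in biases]
    d_in = weights[0].shape[1]
    return forward(abs_weights, abs_biases, np.ones(d_in))

# Illustrative usage on a random network with input dimension 3,
# two hidden layers of width 8, and scalar output.
rng = np.random.default_rng(0)
dims = [3, 8, 8, 1]
weights = [rng.standard_normal((m, n)) for n, m in zip(dims[:-1], dims[1:])]
biases = [rng.standard_normal(m) for m in dims[1:]]
print(abs_network_at_ones(weights, biases))
```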
