Deep Neural Network Approximation For Hölder Functions

01/11/2022
by Ahmed Abdeljawad, et al.

In this work, we explore the approximation capability of deep Rectified Quadratic Unit (ReQU) neural networks for Hölder-regular functions with respect to the uniform norm. We find that the attainable theoretical approximation rates depend heavily on the chosen activation function.
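As background (a minimal sketch, not taken from the paper itself): the ReQU activation is the squared positive part, σ(x) = max(0, x)², and a function f is Hölder-continuous with exponent α if |f(x) − f(y)| ≤ C·|x − y|^α. The snippet below defines ReQU and estimates the Hölder seminorm of √|x| (which is α = 1/2 Hölder) on sample points; the function names are illustrative, not from the paper.

```python
import numpy as np

def requ(x):
    """ReQU activation: squared positive part, sigma(x) = max(0, x)**2."""
    return np.maximum(0.0, x) ** 2

def holder_seminorm(f, xs, alpha):
    """Empirical Hölder seminorm: max of |f(x)-f(y)| / |x-y|**alpha
    over all distinct sample pairs (a lower bound on the true seminorm)."""
    xs = np.asarray(xs, dtype=float)
    vals = f(xs)
    return max(
        abs(vals[i] - vals[j]) / abs(xs[i] - xs[j]) ** alpha
        for i in range(len(xs))
        for j in range(i + 1, len(xs))
    )

# f(x) = sqrt(|x|) is Hölder with exponent alpha = 1/2 and constant <= 1
xs = np.linspace(-1.0, 1.0, 51)
c = holder_seminorm(lambda x: np.sqrt(np.abs(x)), xs, 0.5)
```

Because ReQU is piecewise quadratic, a ReQU network realizes a piecewise polynomial function, which is what makes its approximation rates for Hölder classes differ from those of piecewise-linear (ReLU) networks.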


Related research

06/17/2019
Smooth function approximation by deep neural networks with general activation functions
There has been a growing interest in expressivity of deep neural network...

12/02/2018
On variation of gradients of deep neural networks
We provide a theoretical explanation of the role of the number of nodes ...

10/28/2019
Growing axons: greedy learning of neural networks with application to function approximation
We propose a new method for learning deep neural network models that is ...

04/19/2023
Points of non-linearity of functions generated by random neural networks
We consider functions from the real numbers to the real numbers, output ...

07/23/2020
Nonclosedness of the Set of Neural Networks in Sobolev Space
We examine the closedness of the set of realized neural networks of a fi...

12/16/2021
Approximation of functions with one-bit neural networks
This paper examines the approximation capabilities of coarsely quantized...

07/12/2020
Universal Approximation Power of Deep Neural Networks via Nonlinear Control Theory
In this paper, we explain the universal approximation capabilities of de...
