Effective Minkowski Dimension of Deep Nonparametric Regression: Function Approximation and Statistical Theories

by   Zixuan Zhang, et al.

Existing theories on deep nonparametric regression show that when the input data lie on a low-dimensional manifold, deep neural networks can adapt to the intrinsic data structure. In real-world applications, however, the assumption that the data lie exactly on a low-dimensional manifold is stringent. This paper introduces a relaxed assumption that the input data are concentrated around a subset of ℝ^d denoted by 𝒮, whose intrinsic dimension can be characterized by a new complexity notion – the effective Minkowski dimension. We prove that the sample complexity of deep nonparametric regression depends only on the effective Minkowski dimension of 𝒮, denoted by p. We further illustrate our theoretical findings by considering nonparametric regression with an anisotropic Gaussian random design N(0, Σ), where Σ is full rank. When the eigenvalues of Σ decay exponentially or polynomially, the effective Minkowski dimension of such a Gaussian random design is p = 𝒪(√(log n)) or p = 𝒪(n^γ), respectively, where n is the sample size and γ ∈ (0, 1) is a small constant depending on the polynomial decay rate. Our theory shows that, even when the manifold assumption does not hold, deep neural networks can still adapt to the effective Minkowski dimension of the data and circumvent the curse of the ambient dimensionality for moderate sample sizes.



