Convergence bounds for nonlinear least squares and applications to tensor recovery

08/11/2021
by   Philipp Trunschke, et al.

We consider the problem of approximating a function in general nonlinear subsets of L^2 when only a weighted Monte Carlo estimate of the L^2-norm can be computed. Of particular interest in this setting is the sample complexity, i.e. the number of samples necessary to recover the best approximation. Bounds for this quantity were derived in a previous work; they depend primarily on the model class and are not influenced positively by the regularity of the sought function. However, these results provide only worst-case bounds and cannot explain the remarkable performance of iterative hard thresholding algorithms that is observed in practice. We reexamine the results of the previous paper and derive a new bound that exploits the regularity of the sought function. A critical analysis of our results allows us to derive a sample-efficient algorithm for the model set of low-rank tensors. The viability of this algorithm is demonstrated by recovering quantities of interest for a classical high-dimensional random partial differential equation.
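The abstract mentions iterative hard thresholding (IHT) on low-rank model sets. As an illustrative sketch only (not the paper's actual algorithm, which operates on low-rank tensor formats), the generic IHT template alternates a gradient step on the least-squares residual with a projection back onto the model set. The following minimal example applies this template to low-rank matrix recovery from Gaussian measurements; all names (`truncate_rank`, `iht_lowrank`) and the measurement setup are assumptions made for the demonstration.

```python
import numpy as np

def truncate_rank(X, r):
    """Project a matrix onto the set of matrices of rank at most r (truncated SVD)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def iht_lowrank(y, A, shape, r, step, n_iter=500):
    """Iterative hard thresholding for the linear model y = A @ vec(X).

    Each iteration takes a gradient step on 0.5*||A vec(X) - y||^2
    and projects the iterate back onto the rank-r model set.
    """
    X = np.zeros(shape)
    for _ in range(n_iter):
        grad = A.T @ (A @ X.ravel() - y)            # gradient of the residual
        X = truncate_rank(X - step * grad.reshape(shape), r)
    return X

# Demo: recover a random rank-1 matrix from m Gaussian measurements.
rng = np.random.default_rng(0)
n1, n2, r, m = 8, 8, 1, 200
X_true = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))
A = rng.standard_normal((m, n1 * n2)) / np.sqrt(m)
y = A @ X_true.ravel()
step = 1.0 / np.linalg.norm(A, 2) ** 2              # safe step from the spectral norm
X_hat = iht_lowrank(y, A, (n1, n2), r, step)
rel_err = np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true)
print(rel_err)
```

The projection step is what distinguishes IHT from plain gradient descent: it keeps every iterate inside the nonlinear model set, which is also the source of the sample-complexity questions the paper studies.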


Related research

08/23/2022 · Convergence bounds for nonlinear least squares for tensor recovery
We consider the problem of approximating a function in general nonlinear...

11/14/2017 · Near-optimal sample complexity for convex tensor completion
We analyze low rank tensor completion (TC) using noisy measurements of a...

11/18/2013 · Minimum n-Rank Approximation via Iterative Hard Thresholding
The problem of recovering a low n-rank tensor is an extension of sparse ...

01/02/2020 · Convergence bounds for empirical nonlinear least-squares
We consider best approximation problems in a (nonlinear) subspace M of a...

02/28/2019 · Model Agnostic High-Dimensional Error-in-Variable Regression
We consider the problem of high-dimensional error-in-variable regression...

04/29/2021 · A block-sparse Tensor Train Format for sample-efficient high-dimensional Polynomial Regression
Low-rank tensors are an established framework for high-dimensional least...

08/08/2017 · Demixing Structured Superposition Signals from Periodic and Aperiodic Nonlinear Observations
We consider the demixing problem of two (or more) structured high-dimens...
