Risk upper bounds for RKHS ridge group sparse estimator in the regression model with non-Gaussian and non-bounded error

09/22/2020
by   Halaleh Kamari, et al.

We consider the problem of estimating a meta-model of an unknown regression function when the error term is non-Gaussian and unbounded. The meta-model belongs to a reproducing kernel Hilbert space constructed as a direct sum of Hilbert spaces, which yields an additive decomposition of the model into terms associated with the input variables and with the interactions between them. The meta-model is estimated by minimizing an empirical least-squares criterion penalized by the sum, over the terms of the decomposition, of their Hilbert norms and their empirical L^2-norms. In this setting, upper bounds on both the empirical L^2 risk and the L^2 risk of the estimator are established.
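To make the penalized criterion concrete: writing the meta-model as f = Σ_v f_v, with one term f_v per group of input variables, the estimator minimizes (1/n) Σ_i (y_i − f(X_i))² + Σ_v (γ_v ‖f_v‖_n + μ_v ‖f_v‖_{H_v}), where ‖·‖_n is the empirical L^2-norm and ‖·‖_{H_v} the Hilbert norm of the v-th space. The sketch below is illustrative only, not the authors' implementation: it assumes Gaussian kernels, main-effect terms only, fixed penalty weights gamma and mu, and synthetic data with heavy-tailed Student-t noise (non-Gaussian and unbounded, as in the paper's setting).

```python
# Minimal sketch of the ridge group sparse criterion for an additive RKHS
# meta-model.  All kernels, bandwidths, penalty weights, and the toy data
# below are assumptions for illustration, not the paper's configuration.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, d = 60, 2
X = rng.uniform(size=(n, d))
# Heavy-tailed (non-Gaussian, unbounded) Student-t noise.
y = np.sin(2 * np.pi * X[:, 0]) + X[:, 1] ** 2 + rng.standard_t(df=5, size=n)

def gaussian_gram(x, bw=0.2):
    """Gram matrix of a one-dimensional Gaussian kernel (bandwidth assumed)."""
    diff = x[:, None] - x[None, :]
    return np.exp(-diff ** 2 / (2 * bw ** 2))

# One RKHS per input variable (main effects only); f = sum_v K_v @ theta_v.
Ks = [gaussian_gram(X[:, v]) for v in range(d)]

gamma, mu = 0.05, 0.05  # weights of the empirical-L2 and Hilbert penalties
eps = 1e-8              # smoothing so the square roots stay differentiable

def criterion(theta_flat):
    thetas = theta_flat.reshape(d, n)
    f = sum(K @ th for K, th in zip(Ks, thetas))
    loss = np.mean((y - f) ** 2)                       # empirical least squares
    pen = 0.0
    for K, th in zip(Ks, thetas):
        fv = K @ th
        pen += gamma * np.sqrt(np.mean(fv ** 2) + eps)  # empirical L^2-norm ||f_v||_n
        pen += mu * np.sqrt(th @ K @ th + eps)          # Hilbert norm ||f_v||_H
    return loss + pen

res = minimize(criterion, np.zeros(d * n), method="L-BFGS-B")
print("penalized criterion at optimum:", res.fun)
```

A practical solver would exploit the group structure of the penalty (for instance via proximal methods, which set whole groups exactly to zero) rather than smoothing the nonsmooth norms, but the smoothed L-BFGS version keeps the example self-contained.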


Related research

01/01/2020 · On the Improved Rates of Convergence for Matérn-type Kernel Ridge Regression, with Application to Calibration of Computer Models
Kernel ridge regression is an important nonparametric method for estimat...

06/15/2013 · Early stopping and non-parametric regression: An optimal data-dependent stopping rule
The strategy of early stopping is a regularization technique based on ch...

08/27/2019 · On the Risk of Minimum-Norm Interpolants and Restricted Lower Isometry of Kernels
We study the risk of minimum-norm interpolants of data in a Reproducing ...

04/01/2021 · An Online Projection Estimator for Nonparametric Regression in Reproducing Kernel Hilbert Spaces
The goal of nonparametric regression is to recover an underlying regress...

08/10/2020 · Deterministic error bounds for kernel-based learning techniques under bounded noise
We consider the problem of reconstructing a function from a finite set o...

12/06/2012 · Excess risk bounds for multitask learning with trace norm regularization
Trace norm regularization is a popular method of multitask learning. We ...
