Statistical inference for the slope parameter in functional linear regression
In this paper we consider the linear regression model Y = S X + ε with functional regressors and responses. We develop new inference tools to quantify deviations of the true slope S from a hypothesized operator S_0 with respect to the squared Hilbert–Schmidt norm ‖S − S_0‖², as well as the prediction error 𝔼‖S X − S_0 X‖². Our analysis is applicable to functional time series and is based on asymptotically pivotal statistics. This makes it particularly user-friendly, because it avoids the choice of tuning parameters inherent in long-run variance estimation or the bootstrap for dependent data. We also discuss two-sample problems as well as change point detection. Finite sample properties are investigated by means of a simulation study. Mathematically, our approach is based on a sequential version of the popular spectral cut-off estimator Ŝ_N of S. It is well known that the L²-minimax rates in the functional regression model, both in estimation and prediction, are substantially slower than 1/√(N) (where N denotes the sample size) and that standard estimators for S do not converge weakly to non-degenerate limits. However, we demonstrate that simple plug-in estimators, such as ‖Ŝ_N − S_0‖² for ‖S − S_0‖², are √(N)-consistent and that their sequential versions satisfy weak invariance principles. These results rest on the smoothing effect of L²-norms and are established by a new proof technique, the smoothness shift, which has potential applications in other statistical inverse problems.
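To make the quantities in the abstract concrete, the following is a minimal numerical sketch (not from the paper) of the plug-in statistic ‖Ŝ_N − S_0‖² based on a spectral cut-off estimator computed from curves discretized on an equispaced grid of [0, 1]. The function name hs_statistic, the grid-based quadrature, and the fixed cut-off level k are illustrative assumptions; the paper's sequential construction and the choice of the cut-off are not reproduced here.

```python
import numpy as np

def hs_statistic(X, Y, S0_kernel, k):
    """Plug-in estimate of the squared Hilbert-Schmidt distance
    ||S_hat_N - S_0||^2, with S_hat_N a spectral cut-off estimator.

    X, Y      : (N, p) arrays, curves evaluated on an equispaced grid of [0, 1]
    S0_kernel : (p, p) array, kernel s_0(s, t) of the hypothesized operator S_0
    k         : spectral cut-off level (number of leading eigenpairs kept)
    """
    N, p = X.shape
    delta = 1.0 / p                                # quadrature weight of the grid

    # empirical covariance operator of X, discretized as a p x p kernel matrix
    Cx = X.T @ X / N
    evals, evecs = np.linalg.eigh(Cx * delta)      # eigenpairs of the discretized operator
    idx = np.argsort(evals)[::-1][:k]              # k leading eigenpairs
    lam = evals[idx]
    phi = evecs[:, idx] / np.sqrt(delta)           # eigenfunctions, L2-normalized

    # scores <X_i, phi_j> and the cross-covariance operator applied to phi_j
    scores = X @ phi * delta                       # (N, k)
    CyxPhi = Y.T @ scores / N                      # (p, k), columns approximate C_YX phi_j

    # kernel of the spectral cut-off estimator S_hat_N
    S_hat = CyxPhi @ np.diag(1.0 / lam) @ phi.T    # (p, p)

    # squared Hilbert-Schmidt norm of the difference, by quadrature on the grid
    return np.sum((S_hat - S0_kernel) ** 2) * delta ** 2
```

In this sketch the statistic is a simple double integral of the squared kernel difference; the paper's point is that such plug-in functionals, unlike Ŝ_N itself, admit √(N)-rate asymptotics and pivotal sequential versions.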