Adaptive minimax optimality in statistical inverse problems via SOLIT – Sharp Optimal Lepskii-Inspired Tuning
We consider statistical linear inverse problems in separable Hilbert spaces and filter-based reconstruction methods of the form f̂_α = q_α(T^*T)T^*Y, where Y is the available data, T the forward operator, (q_α)_α∈𝒜 an ordered filter, and α > 0 a regularization parameter. Whenever such a method is used in practice, α has to be chosen appropriately. Typically, the aim is to find, or at least approximate, the best possible α in the sense that the mean squared error (MSE) 𝔼[‖f̂_α − f^†‖^2] with respect to the true solution f^† is minimized. In this paper, we introduce the Sharp Optimal Lepskiĭ-Inspired Tuning (SOLIT) method, which yields an a posteriori parameter choice rule ensuring adaptive minimax rates of convergence. It depends only on the data Y, the noise level σ, the operator T, and the filter (q_α)_α∈𝒜, and does not require any problem-dependent tuning of further parameters. We prove an oracle inequality for the corresponding MSE in a general setting and derive rates of convergence in different scenarios. A careful analysis shows that no other a posteriori parameter choice rule can achieve a better convergence rate of the MSE. In particular, our results reveal that the common understanding that Lepskiĭ-type methods in inverse problems necessarily lose a logarithmic factor is wrong. In addition, the empirical performance of SOLIT is examined in simulations.
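Purely as orientation, and not the paper's SOLIT rule itself, the following minimal sketch illustrates the setting described above: a filter-based estimator with the Tikhonov filter q_α(λ) = 1/(λ + α) and a classical Lepskiĭ-type balancing rule standing in for SOLIT's sharpened thresholds. The forward operator, noise level, candidate grid, and constant κ = 4 are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretized forward operator T: a mildly smoothing convolution-type matrix.
n = 200
idx = np.arange(n)
T = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 10.0) / n

# True solution f† and noisy data Y = T f† + σξ with white Gaussian noise ξ.
f_true = np.sin(np.linspace(0.0, 3.0 * np.pi, n))
sigma = 1e-3
Y = T @ f_true + sigma * rng.standard_normal(n)

# Spectral decomposition of T*T; the filter-based estimator is
# f̂_α = q_α(T*T) T*Y with the Tikhonov filter q_α(λ) = 1/(λ + α).
lam, V = np.linalg.eigh(T.T @ T)
coeffs = V.T @ (T.T @ Y)

def f_hat(alpha):
    return V @ (coeffs / (lam + alpha))

# Candidate grid, ordered so regularization increases (noise decreases).
alphas = np.logspace(-8, -2, 25)
estimates = [f_hat(a) for a in alphas]

# Propagated noise level of each estimator: s(α) = σ √(Σ λ/(λ+α)²).
s = [sigma * np.sqrt(np.sum(lam / (lam + a) ** 2)) for a in alphas]

# Lepskiĭ-type balancing (stand-in for SOLIT): choose the largest j such
# that ‖f̂_j − f̂_i‖ ≤ κ s_i for every less regularized candidate i < j.
kappa = 4.0
j_star = 0
for j in range(1, len(alphas)):
    if all(np.linalg.norm(estimates[j] - estimates[i]) <= kappa * s[i]
           for i in range(j)):
        j_star = j
    else:
        break

print(f"chosen alpha = {alphas[j_star]:.2e}, "
      f"error = {np.linalg.norm(estimates[j_star] - f_true):.3e}")
```

According to the abstract, SOLIT differs from such classical balancing rules precisely in that it attains adaptive minimax rates without the logarithmic loss commonly attributed to Lepskiĭ-type methods.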