Order-Optimal Error Bounds for Noisy Kernel-Based Bayesian Quadrature
In this paper, we study the sample complexity of noisy Bayesian quadrature (BQ), in which we seek to approximate an integral based on noisy black-box queries to the underlying function. We consider functions in a Reproducing Kernel Hilbert Space (RKHS) with the Matérn-ν kernel, focusing on combinations of the parameter ν and dimension d such that the RKHS is equivalent to a Sobolev class. In this setting, we provide near-matching upper and lower bounds on the best possible average error. Specifically, we find that when the black-box queries are subject to Gaussian noise having variance σ², any algorithm making at most T queries (even with adaptive sampling) must incur a mean absolute error of Ω(T^{-ν/d-1} + σT^{-1/2}), and there exists a non-adaptive algorithm attaining an error of at most O(T^{-ν/d-1} + σT^{-1/2}). Hence, the bounds are order-optimal and establish that there is no adaptivity gap in terms of scaling laws.
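As a hypothetical illustration of the noise-driven σT^{-1/2} term (not the algorithm from the paper), consider a plain non-adaptive quadrature rule under Gaussian query noise: estimate ∫₀¹ f(x) dx from T noisy evaluations f(x_t) + N(0, σ²) on a uniform midpoint grid with equal weights. Averaging T independent noise terms leaves a residual of order σT^{-1/2}; the integrand, grid, and parameter values below are illustrative assumptions.

```python
import numpy as np

def noisy_quadrature(f, T, sigma, rng):
    """Equal-weight midpoint-rule estimate of the integral of f over [0, 1],
    computed from T noisy black-box queries f(x_t) + N(0, sigma^2)."""
    xs = (np.arange(T) + 0.5) / T              # uniform midpoint grid on [0, 1]
    ys = f(xs) + sigma * rng.normal(size=T)    # noisy black-box queries
    return ys.mean()                           # equal-weight integral estimate

rng = np.random.default_rng(0)
f = lambda x: np.sin(2 * np.pi * x) + x        # smooth test integrand; true integral = 0.5
T, sigma = 10_000, 1.0

# Mean absolute error over repeated runs; for this smooth integrand the
# midpoint-rule bias is negligible, so the error is dominated by the
# averaged noise, of order sigma * T^{-1/2} (about 0.01 here).
mae = np.mean([abs(noisy_quadrature(f, T, sigma, rng) - 0.5) for _ in range(200)])
print(f"mean absolute error at T={T}: {mae:.4f}")
```

For the smoothness-driven T^{-ν/d-1} term, by contrast, the grid design and quadrature weights matter, and that term dominates when the noise level σ is small relative to T^{1/2 - ν/d - 1}.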