Why Simple Quadrature Is Just as Good as Monte Carlo
We motivate and calculate the variance of Newton-Cotes quadrature integration and compare it directly with the variance of Monte Carlo (MC) integration. We find an equivalence between deterministic quadrature sampling and random MC sampling by noting that random function sampling is statistically indistinguishable from deterministic sampling applied to a randomly shuffled (permuted) function. We use this statistical equivalence to restrict the Bayesian priors one can place on quadrature integration to those that are informationally consistent with theoretical (frequentist) MC probabilities, even when neither the integrand nor the sampling is randomized in any way. This leads to a proof that composite rectangular and midpoint quadrature integrations have expected variances that are less than or equal to their corresponding theoretical MC integration variances. Separately, using Bayesian probability theory, we find that the theoretical standard deviations of the unbiased errors of some composite Newton-Cotes quadrature integrations improve over their worst-case errors by an extra dimension-independent factor ∝ N^{-1/2}, where N is the number of samples taken. This sharpens the standard Newton-Cotes error analysis and shows which quadrature methods can be implemented reliably in higher dimensions. These results are validated in our simulations.
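To illustrate the comparison at the heart of the abstract, here is a minimal numerical sketch (not taken from the paper; the integrand e^x on [0, 1] and the sample counts are illustrative choices) contrasting the error of composite midpoint quadrature with the RMS error of plain MC integration at the same number of function evaluations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative integrand and its exact integral on [0, 1].
f = np.exp
exact = np.e - 1.0  # ∫_0^1 e^x dx

N = 1000  # function evaluations per estimate

# Composite midpoint quadrature: one deterministic sample per cell center.
x_mid = (np.arange(N) + 0.5) / N
quad_error = abs(f(x_mid).mean() - exact)

# Monte Carlo: uniform random samples; repeat runs to estimate the RMS error.
mc_errors = [abs(f(rng.random(N)).mean() - exact) for _ in range(200)]
mc_rms_error = float(np.sqrt(np.mean(np.square(mc_errors))))

print(f"midpoint quadrature error: {quad_error:.2e}")
print(f"Monte Carlo RMS error:     {mc_rms_error:.2e}")
```

For a smooth one-dimensional integrand like this, the midpoint error scales as N^-2 while the MC RMS error scales as N^-1/2, so the deterministic rule wins by several orders of magnitude, consistent with the abstract's claim that these quadrature variances are bounded by the corresponding MC variances.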