On the Tightness of the Laplace Approximation for Statistical Inference

10/17/2022
by Blair Bilodeau, et al.

Laplace's method is used to approximate intractable integrals in a wide range of statistical problems, including Bayesian inference and frequentist marginal likelihood models. It is classically known that the relative error rate of the approximation is not worse than O_p(n^-1) under standard regularity conditions, where n is the sample size. It is unknown whether the error rate can be better than O_p(n^-1) in common applications. We provide the first statistical lower bounds showing that the n^-1 rate is tight. We prove stochastic lower bounds for two simple models: Bayesian inference on fair coin flips, and frequentist marginal likelihood estimation for an over-dispersed Poisson model. We conclude that any set of assumptions under which a faster rate can be derived must be so restrictive as to exclude these simple models, and hence the n^-1 rate is, for practical purposes, the best that can be obtained.
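To see the n^-1 rate in the coin-flip example, here is a minimal Python sketch (not the authors' code). It assumes a uniform Beta(1, 1) prior and exactly n/2 heads, compares the exact marginal likelihood B(k+1, n-k+1) with the Laplace approximation expanded around the posterior mode, and prints the relative error scaled by n; the limiting constant is specific to these assumptions.

```python
import numpy as np
from scipy.special import betaln

# Relative error of the Laplace approximation to the marginal likelihood of
# n coin flips with k heads under a uniform Beta(1, 1) prior (an assumed
# setup for illustration, not the paper's exact construction).

def log_marginal_exact(n, k):
    # Exact: integral of theta^k (1 - theta)^(n - k) dtheta = B(k + 1, n - k + 1).
    return betaln(k + 1, n - k + 1)

def log_marginal_laplace(n, k):
    # Laplace approximation around the mode theta_hat = k / n:
    # f(theta_hat) * sqrt(2 * pi / |d^2 log f(theta_hat)|),
    # where |d^2 log f| at the mode equals n^3 / (k * (n - k)).
    theta_hat = k / n
    log_f_hat = k * np.log(theta_hat) + (n - k) * np.log1p(-theta_hat)
    return log_f_hat + 0.5 * np.log(2 * np.pi * k * (n - k) / n**3)

for n in [10, 100, 1000, 10000]:
    k = n // 2  # fair coin: half heads
    rel_err = abs(np.expm1(log_marginal_laplace(n, k) - log_marginal_exact(n, k)))
    print(f"n = {n:6d}   relative error = {rel_err:.3e}   n * error = {n * rel_err:.3f}")
```

Under these assumptions, n times the relative error settles near a nonzero constant rather than shrinking, which is the behaviour the paper's lower bounds formalize: the relative error is of exact order n^-1, not smaller.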
