The empirical likelihood prior applied to bias reduction of general estimating equations

08/19/2018
by Albert Vexler, et al.

The practice of employing empirical likelihood (EL) components in place of parametric likelihood functions in the construction of Bayesian-type procedures is well addressed in the modern statistical literature. We rigorously derive the EL prior, a Jeffreys-type prior that asymptotically maximizes the Shannon mutual information between the data and the parameters of interest. Our approach centers on an integrated Kullback-Leibler distance between the EL-based posterior and prior density functions. The EL prior is the density function whose corresponding posterior is asymptotically negligibly different from the EL itself. We show that this result can be used to develop a methodology for reducing the asymptotic bias of solutions of general estimating equations and M-estimation schemes by removing the first-order bias term. The technique parallels methods that reduce the asymptotic bias of maximum likelihood estimates by penalizing the underlying parametric likelihoods with their Jeffreys invariant priors. A real data example from a study of myocardial infarction illustrates the practical appeal of the proposed technique.

Keywords: Asymptotic bias, Biased estimating equations, Empirical likelihood, Expected Kullback-Leibler distance, Penalized likelihood, Reference prior.
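To make the bias-reduction analogy concrete, here is a minimal Python sketch, not the authors' code, of the parametric version of the idea the abstract references: penalizing a likelihood by its Jeffreys invariant prior removes the first-order bias term of the maximum likelihood estimate. The exponential-rate model, sample sizes, and simulation setup are illustrative assumptions chosen because the correction there is exact and easy to verify.

```python
# Illustrative sketch (not the paper's code): Jeffreys-prior penalization
# removes the first-order bias of an MLE, the parametric analogue of the
# EL-prior penalization the paper develops for estimating equations.
import numpy as np

rng = np.random.default_rng(0)
true_rate, n, n_reps = 2.0, 10, 20000  # small n makes the O(1/n) bias visible

mle, penalized = [], []
for _ in range(n_reps):
    x = rng.exponential(scale=1.0 / true_rate, size=n)
    s = x.sum()
    # Plain MLE: argmax of n*log(lam) - lam*s, i.e. lam_hat = n / s,
    # which has expectation n*lam/(n-1), so an O(1/n) upward bias.
    mle.append(n / s)
    # Jeffreys penalization: Fisher information is I(lam) = n / lam**2,
    # so the Jeffreys prior is proportional to 1/lam. The penalized
    # log-likelihood (n - 1)*log(lam) - lam*s is maximized at (n - 1) / s,
    # which removes the first-order bias term entirely in this model.
    penalized.append((n - 1) / s)

print(f"true rate         : {true_rate}")
print(f"MLE mean          : {np.mean(mle):.4f}")        # approx 2.22 here
print(f"penalized mean    : {np.mean(penalized):.4f}")  # approx 2.00
```

In the paper's setting, the same first-order correction is achieved nonparametrically: the EL is penalized by the derived EL prior in place of a parametric Jeffreys prior, yielding bias-reduced solutions of general estimating equations.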
