A Maximum Entropy Procedure to Solve Likelihood Equations

04/22/2019
by Antonio Calcagnì, et al.

In this article we provide initial findings on the problem of solving likelihood equations by means of a maximum entropy approach. Unlike standard procedures, which require setting the score function of the maximum-likelihood problem equal to zero, we propose an alternative strategy in which the score is instead used as an external informative constraint on the maximization of the concave Shannon entropy function. The problem involves re-parameterizing the model parameters as expected values of discrete probability distributions whose probabilities must be estimated. This leads to a simpler problem in which the parameters are searched over a smaller (hyper)simplex space. We assessed our proposal by means of empirical case studies and a simulation study, the latter involving the most critical case of logistic regression under data separation. The results suggest that the maximum entropy re-formulation of the score problem solves the likelihood equations. Similarly, when maximum-likelihood estimation is difficult, as in the case of logistic regression under separation, the maximum entropy proposal achieved results numerically comparable to those obtained by Firth's bias-corrected approach. Overall, these first findings indicate that a maximum entropy solution can be considered an alternative technique for solving likelihood equations.
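To make the idea concrete, the following is a minimal sketch of the re-parameterization described above, applied to the simplest possible case: the score equation for the mean of a normal distribution with known variance. The support grid `z`, the solver choice (SLSQP via `scipy.optimize.minimize`), and all names are illustrative assumptions, not the paper's actual implementation.

```python
# A minimal sketch of the maximum entropy reformulation of a likelihood
# equation, under assumptions chosen for illustration: the parameter mu
# is re-parameterized as the expected value of a discrete probability
# distribution p over a fixed support grid z, and Shannon entropy is
# maximized subject to the score equation holding as a constraint.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)
x = rng.normal(loc=2.0, scale=1.0, size=100)   # simulated data
sigma2 = 1.0                                   # known variance

# Assumed support for mu; mu(p) = sum_k p_k * z_k lives in [-5, 5].
z = np.linspace(-5.0, 5.0, 21)

def neg_entropy(p):
    """Negative Shannon entropy (minimized, so entropy is maximized)."""
    p = np.clip(p, 1e-12, None)
    return np.sum(p * np.log(p))

def score(p):
    """Normal-mean score U(mu) = sum_i (x_i - mu) / sigma^2 at mu(p)."""
    mu = z @ p
    return np.sum(x - mu) / sigma2

constraints = [
    {"type": "eq", "fun": score},                    # likelihood equation as constraint
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},  # p is a probability vector
]
p0 = np.full(z.size, 1.0 / z.size)                   # uniform (max entropy) start
res = minimize(neg_entropy, p0, method="SLSQP",
               bounds=[(0.0, 1.0)] * z.size, constraints=constraints)

mu_me = z @ res.x
print(f"max-entropy estimate: {mu_me:.4f}, sample mean: {x.mean():.4f}")
```

In this toy case the score constraint pins down mu(p) uniquely, so the estimate coincides with the sample mean; the entropy objective becomes consequential in harder settings, such as logistic regression under separation, where the score equations alone admit no finite solution.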
