Bayesian Shrinkage Approaches to Unbalanced Problems of Estimation and Prediction on the Basis of Negative Multinomial Samples
In this paper, we treat estimation and prediction problems in which negative multinomial variables are observed, with particular attention to unbalanced settings. First, we consider the problem of estimating multiple negative multinomial parameter vectors under standardized squared error loss and derive a new empirical Bayes estimator that dominates the UMVU estimator under suitable conditions. Second, we consider estimation of the joint predictive density of several multinomial tables under Kullback-Leibler divergence and obtain a sufficient condition under which the Bayesian predictive density based on a hierarchical shrinkage prior dominates the Bayesian predictive density based on the Jeffreys prior. Third, simulations show that our proposed Bayesian estimator and predictive density yield risk improvements. Finally, we discuss the problem of estimating the joint predictive density of negative multinomial variables.
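As a rough illustration of the sampling model and the loss function mentioned above (and not of the paper's actual estimators), the following sketch simulates negative multinomial observations and compares the maximum-likelihood plug-in estimate with a naive additive-smoothing estimate under the standardized squared error loss. The sampler construction, the smoothing constant `a`, and all function names are assumptions made for this illustration.

```python
import numpy as np

def sample_neg_multinomial(r, p, size, rng):
    """Draw `size` negative multinomial count vectors with r stopping events.

    p holds the k cell probabilities; p0 = 1 - p.sum() is the probability
    of the stopping event.  We use the standard composition: the total
    count N ~ NegativeBinomial(r, p0), then N is split multinomially
    among the k cells with probabilities p / (1 - p0).
    """
    p0 = 1.0 - p.sum()
    n = rng.negative_binomial(r, p0, size=size)
    return np.array([rng.multinomial(m, p / (1.0 - p0)) for m in n])

rng = np.random.default_rng(0)
r, p = 5, np.array([0.10, 0.10, 0.10])  # stopping probability p0 = 0.7
x = sample_neg_multinomial(r, p, size=20000, rng=rng)

# Maximum-likelihood plug-in estimate: p_i = x_i / (r + sum_j x_j).
mle = x / (r + x.sum(axis=1, keepdims=True))

# A naive additive-smoothing (shrinkage-flavoured) estimate -- purely
# illustrative, NOT the empirical Bayes estimator derived in the paper.
a = 0.5
shrunk = (x + a) / (r + x.sum(axis=1, keepdims=True) + a * (len(p) + 1))

# Standardized squared error loss: sum_i (d_i - p_i)^2 / p_i,
# averaged over the simulated samples.
risk_mle = np.mean(np.sum((mle - p) ** 2 / p, axis=1))
risk_shrunk = np.mean(np.sum((shrunk - p) ** 2 / p, axis=1))
print(risk_mle, risk_shrunk)
```

Shrinking the cell estimates toward a common value trades a small bias for reduced variance, which is the mechanism behind the risk improvements reported in the abstract.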