Optimal Guessing under Nonextensive Framework and associated Moment Bounds

05/19/2019
by Abhik Ghosh, et al.

We consider the problem of guessing the realization of a random variable under Tsallis' non-extensive entropic framework, rather than the classical Maxwell-Boltzmann-Gibbs-Shannon framework. We treat both the conditional guessing problem, where some related side information is available, and the unconditional one, where no such side information is given. For both types of problems, we derive non-extensive moment bounds on the required number of guesses; here the q-normalized expectation replaces the usual (linear) expectation in defining the non-extensive moments. These moment bounds turn out to be functions of the logarithmic norm entropy, a recently developed two-parameter generalization of the Rényi entropy, which provides their information-theoretic interpretation. We also consider the case of an uncertain source distribution and derive the non-extensive moment bounds for the corresponding mismatched guessing function. Interestingly, these mismatched bounds are linked with an important family of robust statistical divergences known as the relative (α,β)-entropies; a similar link is established between the optimum mismatched guessing and the extremes of these relative entropy measures.
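To make the setup concrete, the following is a minimal sketch of the classical guessing ingredients the abstract builds on: the optimal guessing function ranks symbols in decreasing order of probability, and a normalized q-expectation (here taken in the Tsallis-Mendes-Plastino form, an assumption on our part; the function names are hypothetical, not from the paper) replaces the ordinary expectation when forming moments of the guess count. Setting q = 1 recovers the usual moment E[G^ρ].

```python
import numpy as np

def optimal_guessing_function(pmf):
    """1-based rank of each symbol when guesses are made in
    decreasing-probability order (the optimal guessing strategy)."""
    p = np.asarray(pmf, dtype=float)
    order = np.argsort(-p, kind="stable")       # indices sorted by descending probability
    ranks = np.empty(len(p), dtype=int)
    ranks[order] = np.arange(1, len(p) + 1)     # symbol guessed k-th gets rank k
    return ranks

def q_normalized_moment(pmf, guesses, rho=1.0, q=1.0):
    """Normalized q-expectation of G^rho:
    <G^rho>_q = sum_i p_i^q G_i^rho / sum_i p_i^q.
    q = 1 gives the ordinary moment E[G^rho]."""
    p = np.asarray(pmf, dtype=float)
    w = p ** q
    g = np.asarray(guesses, dtype=float)
    return float(np.sum(w * g ** rho) / np.sum(w))

pmf = [0.5, 0.3, 0.2]
G = optimal_guessing_function(pmf)              # ranks [1, 2, 3]
m_classical = q_normalized_moment(pmf, G)       # ordinary E[G] = 1.7
m_tsallis = q_normalized_moment(pmf, G, q=2.0)  # q-normalized first moment
```

The paper's contribution is to bound such q-normalized moments in terms of the logarithmic norm entropy, generalizing the classical Rényi-entropy bounds for E[G^ρ]; the exact bound expressions are not reproduced here.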
