On the Upper Bound of the Kullback-Leibler Divergence and Cross Entropy
This archival article consists of several short reports on discussions between the two authors over the past two years in Oxford and Madrid, and on the work they carried out during that period on the upper bound of the Kullback-Leibler (KL) divergence and cross entropy. The work was motivated by the cost-benefit ratio proposed by Chen and Golan [1] and by the less desirable property that the KL divergence used in that measure is unbounded. The work subsequently (i) confirmed that the KL divergence used in the cost-benefit ratio should exhibit a bounded property, (ii) proposed a new divergence measure, and (iii) compared this new divergence measure with a few other bounded measures.
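As an illustration of the unboundedness that motivated this work (a generic sketch, not the authors' proposed measure), the following compares the KL divergence, which diverges as a probability in the second distribution approaches zero, with the Jensen-Shannon divergence, one well-known bounded alternative (bounded above by ln 2 in nats):

```python
import math

def kl_divergence(p, q):
    """KL divergence D(p || q) in nats; diverges when some q_i -> 0 with p_i > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence; always bounded above by ln 2."""
    m = [0.5 * (pi + qi) for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = [0.5, 0.5]
for eps in (1e-2, 1e-6, 1e-12):
    q = [1.0 - eps, eps]
    print(f"eps={eps:g}  KL={kl_divergence(p, q):8.3f}  JS={js_divergence(p, q):.3f}")
# As eps -> 0, KL(p || q) grows without bound while JS(p, q) stays below ln 2.
```

The example distributions `p` and `q` are hypothetical, chosen only to exhibit the divergent behaviour.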