Re-examination of Bregman functions and new properties of their divergences

03/01/2018
by Daniel Reem et al.

The Bregman divergence (Bregman distance, Bregman measure of distance) is a useful substitute for a distance, obtained from a well-chosen function (the "Bregman function"). Bregman functions and divergences have been extensively investigated over the last few decades and have found applications in optimization, operations research, information theory, nonlinear analysis, machine learning, and more. This paper re-examines various aspects of the theory of Bregman functions and divergences. In particular, it presents many sufficient conditions that allow the construction of Bregman functions in a general setting and introduces new Bregman functions (such as a negative iterated log entropy). Moreover, it sheds new light on several known Bregman functions, such as quadratic entropies, the negative Havrda-Charvat-Tsallis entropy, and the negative Boltzmann-Gibbs-Shannon entropy, and it shows that the negative Burg entropy, which is not a Bregman function according to the classical theory but is nevertheless known to have "Bregmanian properties", can, in light of our re-examination of the theory, be considered a Bregman function. Our analysis yields several by-products of independent interest, such as the introduction of the concept of relative uniform convexity (a generalization of uniform convexity), new properties of uniformly and strongly convex functions, and results in Banach space theory.
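For context, the standard (textbook) Bregman divergence generated by a differentiable convex function f is D_f(x, y) = f(x) - f(y) - <grad f(y), x - y>; this definition is not specific to the paper's new results. The minimal sketch below, with illustrative helper names and sample vectors not taken from the paper, evaluates it for two classical generators: the squared Euclidean norm, which recovers the squared Euclidean distance, and the negative Boltzmann-Gibbs-Shannon entropy, which recovers the generalized Kullback-Leibler divergence.

```python
import numpy as np

# Textbook Bregman divergence D_f(x, y) = f(x) - f(y) - <grad f(y), x - y>.
# Names and sample points are illustrative, not taken from the paper.

def bregman_divergence(f, grad_f, x, y):
    """Bregman divergence of the convex function f between x and y
    (y is assumed to lie where f is differentiable)."""
    return f(x) - f(y) - np.dot(grad_f(y), x - y)

# Generator 1: squared Euclidean norm; its divergence is the squared distance.
sq_norm = lambda x: np.dot(x, x)
sq_norm_grad = lambda x: 2.0 * x

# Generator 2: negative Boltzmann-Gibbs-Shannon entropy sum_i x_i * log(x_i);
# its divergence is the generalized Kullback-Leibler divergence.
neg_entropy = lambda x: np.sum(x * np.log(x))
neg_entropy_grad = lambda x: np.log(x) + 1.0

x = np.array([0.2, 0.3, 0.5])
y = np.array([0.3, 0.3, 0.4])

print(bregman_divergence(sq_norm, sq_norm_grad, x, y))          # equals ||x - y||^2
print(bregman_divergence(neg_entropy, neg_entropy_grad, x, y))  # equals KL(x || y), since x and y sum to 1
```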


Related research

02/18/2020
DLITE: The Discounted Least Information Theory of Entropy
We propose an entropy-based information measure, namely the Discounted L...

10/22/2018
A Family of Statistical Divergences Based on Quasiarithmetic Means
This paper proposes a generalization of Tsallis entropy and Tsallis rela...

07/30/2021
Representing Pareto optima in preordered spaces: from Shannon entropy to injective monotones
Shannon entropy is the most widely used measure of uncertainty. It is us...

06/16/2023
On Orderings of Probability Vectors and Unsupervised Performance Estimation
Unsupervised performance estimation, or evaluating how well models perfo...

06/29/2023
Matroidal Entropy Functions: Constructions, Characterizations and Representations
In this paper, we characterize matroidal entropy functions, i.e., entrop...

02/28/2018
Distance entropy cartography characterises centrality in complex networks
We introduce distance entropy as a measure of homogeneity in the distrib...

07/04/2021
A precise bare simulation approach to the minimization of some distances. Foundations
In information theory – as well as in the adjacent fields of statistics,...
