An Effective Bernstein-type Bound on Shannon Entropy over Countably Infinite Alphabets

06/23/2021
by Yunpeng Zhao, et al.

We prove a Bernstein-type bound for the difference between the average of negative log-likelihoods of independent discrete random variables and the Shannon entropy, both defined on a countably infinite alphabet. The result holds for the class of discrete random variables with tails lighter than, or of the same order as, a discrete power-law distribution. Most commonly used discrete distributions, such as the Poisson distribution, the negative binomial distribution, and the power-law distribution itself, belong to this class. The bound is effective in the sense that we provide a method to compute the constants appearing in it.
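The quantity the abstract concentrates is the empirical average of negative log-likelihoods, (1/n) ∑ −log p(X_i), which converges to the Shannon entropy H = −∑_k p(k) log p(k). Below is a minimal sketch, not from the paper, illustrating this estimator for a Poisson distribution (one of the covered light-tailed examples); all function names here are illustrative:

```python
import math
import random

def poisson_pmf(k, lam):
    # P(X = k) for a Poisson(lam) random variable
    return math.exp(-lam) * lam**k / math.factorial(k)

def sample_poisson(lam, rng):
    # Inverse-CDF sampling over the countably infinite alphabet {0, 1, 2, ...}
    u = rng.random()
    k, cdf = 0, poisson_pmf(0, lam)
    while cdf < u:
        k += 1
        cdf += poisson_pmf(k, lam)
    return k

def empirical_entropy(lam, n, seed=0):
    # Average of negative log-likelihoods of n i.i.d. draws; by the law of
    # large numbers this converges to the Shannon entropy, and the paper's
    # Bernstein-type bound controls the deviation for light-tailed laws.
    rng = random.Random(seed)
    return sum(-math.log(poisson_pmf(sample_poisson(lam, rng), lam))
               for _ in range(n)) / n

def true_entropy(lam, tol=1e-12):
    # H = -sum_k p(k) log p(k), with the infinite sum truncated once the
    # remaining terms are negligible (valid since the Poisson tail is light).
    h, k = 0.0, 0
    while True:
        p = poisson_pmf(k, lam)
        if p > 0:
            h += -p * math.log(p)
        if k > lam and p < tol:
            break
        k += 1
    return h
```

For example, `empirical_entropy(3.0, 20000)` should land close to `true_entropy(3.0)` (entropy in nats); the paper's contribution is a computable, Bernstein-type tail bound on exactly this deviation, uniformly over distributions with tails no heavier than a discrete power law.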


