The deficit in an entropic inequality

05/29/2018
by James Melbourne, et al.

In this article, we investigate the entropy of the sum of a discrete and a continuous random variable. We obtain bounds on the entropy of the sum when the continuous random variable is Gaussian or, more generally, log-concave. We also obtain bounds on the capacity of a channel whose input is the discrete random variable and whose output is that input corrupted by additive noise modeled by the continuous random variable. The bounds are shown to be sharp when the discrete variable is Bernoulli.
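As a rough illustration of the setting (not the paper's bounds), the sketch below numerically estimates the differential entropy h(X + Z) and the mutual information I(X; X + Z) = h(X + Z) - h(Z) for a Bernoulli input X and Gaussian noise Z, the sharp case mentioned in the abstract. The choice of noise level and the grid search over the Bernoulli parameter are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import quad

# Illustrative sketch only: X ~ Bernoulli(p) taking values {0, 1},
# Z ~ N(0, sigma^2) independent of X. The sum Y = X + Z has a
# two-component Gaussian mixture density.

def mixture_density(y, p, sigma):
    """Density of Y = X + Z: (1-p) N(0, sigma^2) + p N(1, sigma^2)."""
    g0 = np.exp(-y**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    g1 = np.exp(-(y - 1)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return (1 - p) * g0 + p * g1

def differential_entropy(p, sigma):
    """h(Y) = -int f(y) log f(y) dy, estimated by numerical quadrature (nats)."""
    integrand = lambda y: -mixture_density(y, p, sigma) * np.log(mixture_density(y, p, sigma))
    val, _ = quad(integrand, -10 * sigma, 1 + 10 * sigma)
    return val

def mutual_information(p, sigma):
    """I(X; Y) = h(Y) - h(Z), with h(Z) = 0.5 * log(2*pi*e*sigma^2) for Gaussian noise."""
    h_noise = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
    return differential_entropy(p, sigma) - h_noise

if __name__ == "__main__":
    sigma = 0.5
    # Brute-force the capacity of this Bernoulli-input, Gaussian-noise channel
    # by maximizing I(X; Y) over the input probability p.
    ps = np.linspace(0.01, 0.99, 99)
    rates = [mutual_information(p, sigma) for p in ps]
    best = int(np.argmax(rates))
    print(f"sigma = {sigma}: capacity approx {rates[best]:.4f} nats at p approx {ps[best]:.2f}")
```

By symmetry of the mixture, the maximizing input probability lands near p = 1/2; the numerical estimate of the capacity is the quantity the paper's bounds are designed to control analytically.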
