Infinite Divisibility of Information

08/13/2020
by Cheuk Ting Li, et al.

We study an information analogue of infinitely divisible probability distributions, where the i.i.d. sum is replaced by the joint distribution of an i.i.d. sequence. A random variable X is called informationally infinitely divisible if, for any n ≥ 1, there exists an i.i.d. sequence of random variables Z_1,…,Z_n that contains the same information as X, i.e., there exists an injective function f such that X = f(Z_1,…,Z_n). While no discrete random variable is informationally infinitely divisible, we show that any discrete random variable X has a bounded multiplicative gap to infinite divisibility: if we remove the injectivity requirement on f, then there exist i.i.d. Z_1,…,Z_n and f satisfying X = f(Z_1,…,Z_n), and the entropy satisfies H(X)/n ≤ H(Z_1) ≤ 1.59·H(X)/n + 2.43. We also study a new class of discrete probability distributions, called spectral infinitely divisible distributions, for which the multiplicative factor 1.59 can be removed. Furthermore, we study the case where X = (Y_1,…,Y_m) is itself an i.i.d. sequence, m ≥ 2, for which the multiplicative factor 1.59 can be replaced by 1 + 5√((log m)/m). This means that as m increases, (Y_1,…,Y_m) becomes closer to being spectral infinitely divisible in a uniform manner. This can be regarded as an information analogue of Kolmogorov's uniform theorem. Applications of our result include independent component analysis, distributed storage with a secrecy constraint, and distributed random number generation.
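As a quick sanity check on the definition (an illustrative example, not taken from the abstract), the uniform distribution on 2^n points is exactly divisible: take Z_1,…,Z_n to be i.i.d. fair bits and f the concatenation map, which is injective. In LaTeX notation, with H denoting Shannon entropy in bits:

\[
X \sim \mathrm{Unif}\,\{0,1\}^n, \qquad
Z_i \overset{\text{i.i.d.}}{\sim} \mathrm{Bern}(1/2), \qquad
f(z_1,\dots,z_n) = (z_1,\dots,z_n)
\;\Longrightarrow\;
H(Z_1) = \frac{H(X)}{n} = 1 \text{ bit}.
\]

The bound H(X)/n ≤ H(Z_1) ≤ 1.59·H(X)/n + 2.43 quantifies how far a general discrete X can be from this ideal case.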
