Information-Distilling Quantizers

12/07/2018
by Alankrita Bhatt, et al.

Let X and Y be dependent random variables. This paper considers the problem of designing a scalar quantizer for Y to maximize the mutual information between the quantizer's output and X, and develops fundamental properties and bounds for this form of quantization, which is connected to the log-loss distortion criterion. The main focus is the regime of low I(X;Y), where it is shown that, if X is binary, a constant fraction of the mutual information can always be preserved using O(log(1/I(X;Y))) quantization levels, and there exist distributions for which this many quantization levels are necessary. Furthermore, for larger finite alphabets 2 < |X| < ∞, it is established that an η-fraction of the mutual information can be preserved using roughly (log(|X|/I(X;Y)))^(η·(|X| − 1)) quantization levels.
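To make the objective I(X; Q(Y)) concrete, here is a minimal Python sketch for the case where Y has a small finite alphabet: it computes the mutual information induced by a quantizer and exhaustively searches over all maps from Y's alphabet into a given number of cells. The joint pmf and the brute-force search are illustrative assumptions for this sketch only, not the constructions or bounds analyzed in the paper.

import itertools
import numpy as np

def mutual_information(p_xy):
    # I(X;Y) in bits for a joint pmf given as a 2-D array p_xy[x, y].
    p_xy = np.asarray(p_xy, dtype=float)
    px = p_xy.sum(axis=1, keepdims=True)
    py = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask])).sum())

def best_quantizer(p_xy, num_levels):
    # Exhaustively search all maps Y -> {0, ..., num_levels-1} and return the
    # one maximizing I(X; Q(Y)). Feasible only for small |Y|; shown purely to
    # illustrate the objective, not an efficient or optimal design procedure.
    p_xy = np.asarray(p_xy, dtype=float)
    num_y = p_xy.shape[1]
    best_val, best_map = -1.0, None
    for labels in itertools.product(range(num_levels), repeat=num_y):
        # Joint pmf of (X, Q(Y)): merge the columns of Y that share a label.
        p_xq = np.zeros((p_xy.shape[0], num_levels))
        for y, q in enumerate(labels):
            p_xq[:, q] += p_xy[:, y]
        val = mutual_information(p_xq)
        if val > best_val:
            best_val, best_map = val, labels
    return best_map, best_val

if __name__ == "__main__":
    # Hypothetical joint pmf: binary X, Y taking 5 values.
    p_xy = np.array([[0.05, 0.10, 0.15, 0.10, 0.10],
                     [0.10, 0.10, 0.05, 0.15, 0.10]])
    print("I(X;Y) =", mutual_information(p_xy))
    for m in (2, 3):
        q, val = best_quantizer(p_xy, m)
        print(f"best {m}-level quantizer {q}: I(X;Q(Y)) = {val:.4f}")

Running the sketch shows how much of I(X;Y) survives as the number of quantization levels grows, which is exactly the trade-off the paper quantifies in the low-I(X;Y) regime.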

