The Mutual Information In The Vicinity of Capacity-Achieving Input Distributions

04/27/2023
by Hao-Chung Cheng et al.

The mutual information is analyzed as a function of the input distribution, using an identity due to Topsøe, for channels with (possibly multiple) linear cost constraints and finite input and output sets. The mutual information is bounded above by a function that decreases quadratically with the distance to the set of all capacity-achieving input distributions, whenever that distance is below a certain threshold. Closed-form expressions for the threshold and the coefficient of the quadratic decrease are derived. A counterexample demonstrating the non-existence of such a quadratic bound in the case of infinitely many linear cost constraints is provided. Implications of these observations for the channel coding problem and applications of the proof technique to related problems are discussed.
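As a rough illustration only (not the paper's exact statement), the quadratic bound described above can be written in the following schematic form, where C denotes the channel capacity under the cost constraints, Π the set of capacity-achieving input distributions, and γ > 0 and τ > 0 stand in for the coefficient and threshold whose closed-form expressions are derived in the paper:

\[
I(p; W) \;\le\; C \;-\; \gamma \,\Big(\min_{q \in \Pi} \lVert p - q \rVert\Big)^{2}
\qquad \text{whenever } \min_{q \in \Pi} \lVert p - q \rVert \le \tau .
\]

The choice of norm and the precise values of γ and τ depend on the channel and the cost constraints; see the paper for the actual expressions.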
