A Training-Based Mutual Information Lower Bound for Large-Scale Systems

07/30/2021
by Xiangbo Meng, et al.

We provide a mutual information lower bound that can be used to analyze the effect of training in models with unknown parameters. For large-scale systems, we show that this bound can be calculated as the difference between two derivatives of a conditional entropy function, without requiring explicit estimation of the unknown parameters. We give a step-by-step procedure for computing the bound, illustrate it with an example application, and compare it with known classical mutual information bounds.
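The abstract does not spell out the explicit form of the bound, so the following is only an illustrative sketch of the kind of quantities involved. It uses the standard decomposition of mutual information into entropies and shows how a derivative of a conditional entropy function with respect to a generic scalar system parameter rho (a hypothetical placeholder, as is the step size delta) could be approximated by a central finite difference:

\[
I(\mathbf{x};\mathbf{y}) \;=\; h(\mathbf{y}) - h(\mathbf{y}\mid\mathbf{x}),
\qquad
\frac{\partial}{\partial \rho}\, h(\mathbf{y}\mid\mathbf{x};\rho)
\;\approx\;
\frac{h(\mathbf{y}\mid\mathbf{x};\rho+\delta) - h(\mathbf{y}\mid\mathbf{x};\rho-\delta)}{2\delta}.
\]

Per the abstract, the actual lower bound is formed from the difference between two such derivatives of a conditional entropy function and avoids explicit estimation of the unknown parameters; the expressions above are only a guess at the quantities being differentiated, not the paper's stated result.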

research
12/02/2020

Analyzing Training Using Phase Transitions in Entropy—Part I: General Theory

We analyze phase transitions in the conditional entropy of a sequence ca...
research
12/04/2018

A Tight Upper Bound on Mutual Information

We derive a tight lower bound on equivocation (conditional entropy), or ...
research
11/16/2020

Regularized Mutual Information Neural Estimation

With the variational lower bound of mutual information (MI), the estimat...
research
11/23/2022

Mutual Information Learned Regressor: an Information-theoretic Viewpoint of Training Regression Systems

As one of the central tasks in machine learning, regression finds lots o...
research
11/10/2018

Formal Limitations on the Measurement of Mutual Information

Motivated by applications to unsupervised learning, we consider the probl...
research
02/19/2021

Sequential- and Parallel-Constrained Max-value Entropy Search via Information Lower Bound

Recently, several Bayesian optimization (BO) methods have been extended ...
research
11/06/2019

Conditional Mutual Information Neural Estimator

Several recent works in communication systems have proposed to leverage ...
