The information loss of a stochastic map

07/05/2021
by James Fullwood, et al.

We provide a stochastic extension of the Baez-Fritz-Leinster characterization of the Shannon information loss associated with a measure-preserving function. This recovers the conditional entropy and a closely related information-theoretic measure that we call "conditional information loss." Although not functorial, these information measures are semi-functorial, a concept we introduce that is definable in any Markov category. We also introduce the notion of an "entropic Bayes' rule" for information measures, and we provide a characterization of conditional entropy in terms of this rule.
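As a rough illustration (not taken from the paper itself), the sketch below computes the conditional entropy H(X|Y) induced by an input distribution and a stochastic map, represented as a row-stochastic kernel. When the map is deterministic and measure-preserving, this quantity reduces to H(p) - H(q), the Baez-Fritz-Leinster information loss of the underlying function. The function names and the small worked example are hypothetical and assume finite probability spaces.

```python
import numpy as np

def entropy(dist):
    """Shannon entropy (in bits) of a probability vector, ignoring zero entries."""
    d = np.asarray(dist, dtype=float)
    nz = d[d > 0]
    return float(-np.sum(nz * np.log2(nz)))

def conditional_entropy_of_kernel(p, kernel):
    """
    H(X|Y) for an input distribution p on X and a stochastic map (Markov kernel)
    with kernel[x][y] = Pr(Y = y | X = x).  Computed as H(X, Y) - H(Y) from the
    joint distribution joint[x, y] = p[x] * kernel[x, y].
    """
    p = np.asarray(p, dtype=float)
    K = np.asarray(kernel, dtype=float)
    joint = p[:, None] * K          # joint distribution on X x Y
    q = joint.sum(axis=0)           # pushforward (output) distribution on Y
    return entropy(joint.ravel()) - entropy(q)

# Deterministic example: f merges two outcomes of a uniform 3-element source.
# The kernel rows are one-hot, so H(X|Y) equals H(p) - H(q), the
# Baez-Fritz-Leinster information loss of the measure-preserving function.
p = [1/3, 1/3, 1/3]
f = [[1, 0],
     [1, 0],
     [0, 1]]
q = np.asarray(p) @ np.asarray(f, dtype=float)
print(conditional_entropy_of_kernel(p, f))  # ~0.667 bits
print(entropy(p) - entropy(q))              # same value for a deterministic map
```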
