Generalized Decomposition of Multivariate Information

by Thomas F. Varley, et al.

Since its introduction, the partial information decomposition (PID) has emerged as a powerful information-theoretic technique for studying the structure of (potentially higher-order) interactions in complex systems. Despite its utility, the applicability of the PID is restricted by the need to assign elements as either inputs or targets, as well as by the specific structure of the mutual information itself. Here, we introduce a generalized information decomposition that relaxes the source/target distinction while still satisfying the basic intuitions about information. This approach is based on decomposing the Kullback-Leibler divergence, and consequently allows for the analysis of any information gained when updating from an arbitrary prior to an arbitrary posterior. As a result, any information-theoretic measure that can be written as a Kullback-Leibler divergence admits a decomposition in the style of Williams and Beer, including the total correlation, the negentropy, and the mutual information as special cases. In this paper, we explore how the generalized information decomposition can reveal novel insights into existing measures, as well as into the nature of higher-order synergies. We show that synergistic information is intimately related to the well-known Tononi-Sporns-Edelman (TSE) complexity, and that synergistic information requires a similar integration/segregation balance as a high TSE complexity. Finally, we end with a discussion of how this approach fits into other attempts to generalize the PID, and of the possibilities for empirical applications.
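As a minimal illustration of the special-case claim above (this sketch is not from the paper itself): the mutual information can be written as a Kullback-Leibler divergence from an independent "prior" (the product of marginals) to the joint "posterior", which is exactly the form the generalized decomposition operates on. A short NumPy check, with all variable names and the example distribution chosen here for illustration:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def entropy(p):
    """Shannon entropy in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical joint distribution of two correlated binary variables X, Y
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

# Marginals, and their product as the independent prior
px = joint.sum(axis=1)
py = joint.sum(axis=0)
prior = np.outer(px, py)

# Mutual information as a KL divergence: I(X;Y) = D(p(x,y) || p(x)p(y))
mi_as_kl = kl_divergence(joint.ravel(), prior.ravel())

# Sanity check against the entropy form: I(X;Y) = H(X) + H(Y) - H(X,Y)
mi_direct = entropy(px) + entropy(py) - entropy(joint.ravel())
assert np.isclose(mi_as_kl, mi_direct)
```

The total correlation admits the same form with more variables (joint distribution versus the product of all marginals), which is why it, too, falls under the decomposition described in the abstract.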


