Quantifying multivariate redundancy with maximum entropy decompositions of mutual information
Williams and Beer (2010) proposed a nonnegative decomposition of mutual information, based on the construction of redundancy lattices, which separates the information that a set of variables contains about a target variable into nonnegative components: the unique information of some variables that is not contained in others, together with redundant and synergistic components. However, defining multivariate redundancy measures that are nonnegative and conform to certain axioms capturing conceptually desirable properties of redundancy has proven elusive. Here we present a procedure to determine multivariate redundancy measures within the framework of maximum entropy models. In particular, we generalize existing bivariate maximum-entropy measures of redundancy and unique information, defining measures of the redundant information that a group of variables has about a target, and of the unique redundant information that a group of variables has about a target and that is not redundant with the information from another group. This approach rests on two key ingredients. First, the identification of a type of constraint on entropy maximization that isolates components of redundancy and unique redundancy by mirroring them to synergy components. Second, the construction of rooted-tree decompositions that break down mutual information, ensuring nonnegativity through the local implementation of maximum entropy information projections at each binary unfolding of the tree nodes. Altogether, the proposed measures are nonnegative and conform to the desirable axioms for redundancy measures.
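For orientation, the bivariate case of the Williams and Beer decomposition can be sketched as follows; the notation ($Y$ for the target, $X_1, X_2$ for the sources, $R$, $U$, $S$ for redundancy, unique information, and synergy) is generic and not necessarily that of the paper:

```latex
\begin{align*}
I(Y; X_1, X_2) &= R(Y; X_1, X_2) + U(Y; X_1) + U(Y; X_2) + S(Y; X_1, X_2),\\
I(Y; X_i) &= R(Y; X_1, X_2) + U(Y; X_i), \qquad i \in \{1, 2\}.
\end{align*}
```

The joint mutual information splits into four nonnegative terms, and each single-source mutual information into a redundant plus a unique part; the multivariate measures discussed in the abstract generalize $R$ and $U$ from single variables to groups of variables.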