ITENE: Intrinsic Transfer Entropy Neural Estimator

by Jingjing Zhang, et al.

Quantifying the directionality of information flow is instrumental in understanding, and possibly controlling, the operation of many complex systems, such as transportation, social, neural, or gene-regulatory networks. The standard Transfer Entropy (TE) metric follows Granger's causality principle by measuring the Mutual Information (MI) between the past states of a source signal X and the future state of a target signal Y while conditioning on past states of Y. Hence, the TE quantifies the improvement (as measured by the log-loss) in the prediction of the target sequence Y that can be accrued when, in addition to the past of Y, one also has available past samples from X. However, by conditioning on the past of Y, the TE also measures information that can be synergistically extracted by observing both the past of X and Y, and not solely the past of X. Building on a private key agreement formulation, the Intrinsic TE (ITE) aims to discount such synergistic information in order to quantify the degree to which X is individually predictive of Y, independently of Y's past. In this paper, an estimator of the ITE is proposed that is inspired by the recently proposed Mutual Information Neural Estimation (MINE). The estimator is based on a variational bound on the KL divergence, two-sample neural network classifiers, and the pathwise estimator of Monte Carlo gradients.
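The variational bound on the KL divergence underlying MINE-style estimators is the Donsker-Varadhan representation, KL(P || Q) ≥ E_P[T(x)] − log E_Q[exp(T(x))], which holds for any critic function T and is tight when T is the log density ratio log dP/dQ. MINE parameterizes T with a neural network and maximizes the bound by gradient ascent; the toy sketch below (not from the paper) instead plugs in the analytically optimal critic for a hypothetical Gaussian pair P = N(1,1), Q = N(0,1), where T*(x) = x − 0.5 and the true KL is 0.5, to show that the sample-based bound recovers the divergence:

```python
import math
import random

random.seed(0)

def critic(x):
    # Optimal critic T*(x) = log dP/dQ for P = N(1,1), Q = N(0,1):
    # log ratio = -(x-1)^2/2 + x^2/2 = x - 0.5
    return x - 0.5

def dv_bound(p_samples, q_samples):
    """Sample-based Donsker-Varadhan estimate of KL(P || Q)."""
    e_p = sum(critic(x) for x in p_samples) / len(p_samples)
    e_q = sum(math.exp(critic(x)) for x in q_samples) / len(q_samples)
    return e_p - math.log(e_q)

p = [random.gauss(1.0, 1.0) for _ in range(100_000)]  # samples from P
q = [random.gauss(0.0, 1.0) for _ in range(100_000)]  # samples from Q

estimate = dv_bound(p, q)  # close to the true KL of 0.5
```

In the actual estimator the critic is a trained network and the samples come from the joint and product-of-marginals distributions of the signals, so the same bound yields an MI (and hence TE) estimate rather than a plain KL.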


