NIF: A Framework for Quantifying Neural Information Flow in Deep Networks

01/20/2019
by Brian Davis, et al.

In this paper, we present a new approach to interpreting deep learning models. More precisely, by coupling mutual information with network science, we explore how information flows through feedforward networks. We show that efficiently approximating mutual information via the dual representation of the Kullback-Leibler divergence lets us construct a measure that quantifies how much information flows between any two neurons of a deep learning model. Building on this, we propose NIF, Neural Information Flow, a new metric for codifying information flow that exposes the internals of a deep learning model while providing feature attributions.
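The dual (Donsker-Varadhan) representation mentioned above writes KL(P‖Q) = sup_T E_P[T] − log E_Q[e^T]; since mutual information is the KL divergence between the joint distribution and the product of marginals, any fixed critic T plugged into this expression yields a lower bound on MI. The abstract does not specify the paper's estimator, so the sketch below is only illustrative: instead of training a neural critic, it grid-searches a hypothetical bilinear critic T(x, y) = a·x·y to bound the MI of a correlated Gaussian pair, where the true value is known in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.8, 100_000

# Correlated Gaussian pair (X, Y); true MI = -0.5 * log(1 - rho^2)
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Shuffling y gives samples from the product of marginals p(x)p(y)
y_shuf = rng.permutation(y)

def dv_bound(a):
    """Donsker-Varadhan lower bound on MI with critic T(x, y) = a * x * y."""
    joint_term = np.mean(a * x * y)                      # E_p(x,y)[T]
    marginal_term = np.log(np.mean(np.exp(a * x * y_shuf)))  # log E_p(x)p(y)[e^T]
    return joint_term - marginal_term

# Grid search over the critic parameter (a stand-in for training a network);
# a is kept below ~0.6 so E[e^{a X Y}] stays finite and low-variance
best = max(dv_bound(a) for a in np.linspace(0.05, 0.6, 24))
true_mi = -0.5 * np.log(1 - rho**2)
print(f"DV lower bound: {best:.3f}  (true MI: {true_mi:.3f})")
```

Because this critic family is so restricted, the bound is loose; with an expressive trained critic, the Donsker-Varadhan bound can be tightened toward the true MI, which is the idea behind neural MI estimators.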
