Analytic relations between networks: encoding, decoding, and causality

07/14/2022
by Yang Tian, et al.

Networks are common in physics, biology, computer science, and social science. Quantifying the relations (e.g., similarity) between networks paves the way for understanding the latent information shared across them. However, fundamental metrics of relations, such as information divergence, mutual information, Fisher information, and causality, are not well-defined between networks. As a compromise, commonly used strategies (e.g., network embedding, matching, and kernel approaches) measure network relations in data-driven ways. These approaches are computation-oriented and inapplicable to analytic derivations in mathematics and physics. To resolve these issues, we present a theory that derives an optimal characterization of network topological properties. Our theory shows that a network can be fully represented by a Gaussian variable defined by the discrete Schrödinger operator, which simultaneously satisfies network-topology-dependent smoothness and maximum entropy properties. Based on this characterization, we can analytically measure diverse relations between networks in terms of their topological properties. As illustrations, we primarily show how to define encoding (e.g., information divergence and mutual information), decoding (e.g., Fisher information), and causality (e.g., transfer entropy and Granger causality) between networks. We validate our framework on representative networks (e.g., evolutionary random network models, a protein-protein interaction network, and chemical compound networks), and demonstrate that a series of science and engineering challenges (e.g., network evolution, clustering, and classification) can be tackled from a new perspective. A computationally efficient implementation of our theory is released as an open-source toolbox.
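To make the idea concrete, the sketch below illustrates one way such a Gaussian characterization could be used to compare two networks. It assumes a discrete Schrödinger operator of the form H = L + V (graph Laplacian plus a diagonal potential, taken here as the identity) acting as the precision matrix of a zero-mean Gaussian, and then computes the Kullback-Leibler divergence between the two resulting distributions. The function names and the specific choice of potential are illustrative assumptions; the paper's exact construction and released toolbox may differ.

```python
import numpy as np

def schrodinger_operator(adjacency, potential=None):
    """Discrete Schrödinger operator H = L + V on a graph.

    L is the combinatorial Laplacian (degree matrix minus adjacency);
    V is a diagonal potential (identity by default, an assumption here).
    """
    degrees = np.diag(adjacency.sum(axis=1))
    laplacian = degrees - adjacency
    if potential is None:
        potential = np.eye(adjacency.shape[0])
    return laplacian + potential

def network_gaussian(adjacency):
    """Zero-mean Gaussian characterization of a network, assuming the
    Schrödinger operator plays the role of the precision matrix."""
    H = schrodinger_operator(adjacency)
    return np.zeros(H.shape[0]), np.linalg.inv(H)  # mean, covariance

def gaussian_kl(cov_p, cov_q):
    """KL divergence between zero-mean Gaussians N(0, cov_p) and N(0, cov_q)."""
    n = cov_p.shape[0]
    trace_term = np.trace(np.linalg.inv(cov_q) @ cov_p)
    logdet_term = np.linalg.slogdet(cov_q)[1] - np.linalg.slogdet(cov_p)[1]
    return 0.5 * (trace_term - n + logdet_term)

# Example: information divergence between two small networks of equal size.
A_star = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)  # star graph
A_path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # path graph
_, cov_star = network_gaussian(A_star)
_, cov_path = network_gaussian(A_path)
print(gaussian_kl(cov_star, cov_path))
```

Under these assumptions, other relations mentioned in the abstract (mutual information, Fisher information, transfer entropy, Granger causality) would likewise reduce to closed-form expressions between the corresponding Gaussian variables.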
