Scalable Mutual Information Estimation using Dependence Graphs

01/27/2018
by Morteza Noshad, et al.

We propose a unified method for empirical non-parametric estimation of a general Mutual Information (MI) function between random vectors in R^d based on N i.i.d. samples. The proposed low-complexity estimator is based on a bipartite graph, referred to as the dependence graph. The data points are mapped to the vertices of this graph using randomized Locality Sensitive Hashing (LSH). The vertex and edge weights are defined in terms of marginal and joint hash collisions. For a given set of hash parameters ϵ(1), ..., ϵ(k), a base estimator is defined as a weighted average of the transformed edge weights. The proposed estimator, called the ensemble dependence graph estimator (EDGE), is obtained as a weighted average of the base estimators, where the weights are computed offline as the solution of a linear programming problem. EDGE achieves optimal computational complexity O(N), and can achieve the optimal parametric MSE rate of O(1/N) if the density is d times differentiable. To the best of our knowledge, EDGE is the first non-parametric MI estimator that can achieve parametric MSE rates with linear time complexity.
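As a rough illustration of the construction described above, the following is a minimal Python sketch of a single base dependence-graph estimate; it is not the authors' implementation. It assumes a simple fixed-width grid hash in place of a general LSH family, and uses g(x) = log(x) so the estimate targets Shannon MI; the full EDGE estimator combines several such base estimates, computed at hash parameters ϵ(1), ..., ϵ(k), using ensemble weights obtained offline from a linear program.

```python
import numpy as np
from collections import Counter

def base_edge_estimate(X, Y, eps):
    """One base dependence-graph estimate of Shannon MI (in nats).

    X, Y : paired samples, arrays of shape (N, d_X) and (N, d_Y).
    eps  : hash bin width (stands in for the LSH parameter; assumption).
    """
    X = np.asarray(X, dtype=float).reshape(len(X), -1)
    Y = np.asarray(Y, dtype=float).reshape(len(Y), -1)
    N = X.shape[0]

    # Grid hash (assumed LSH): each sample maps to the tuple of its bin indices.
    hx = [tuple(np.floor(x / eps).astype(int)) for x in X]
    hy = [tuple(np.floor(y / eps).astype(int)) for y in Y]

    # Vertex weights: marginal hash-collision counts N_i and M_j.
    Ni = Counter(hx)
    Mj = Counter(hy)
    # Edge weights: joint collision counts N_ij on the bipartite dependence graph.
    Nij = Counter(zip(hx, hy))

    # Base estimate: weighted average of transformed edge weights,
    #   I_hat = sum_{ij} (N_ij / N) * log( N * N_ij / (N_i * M_j) ).
    I_hat = 0.0
    for (i, j), nij in Nij.items():
        I_hat += (nij / N) * np.log(N * nij / (Ni[i] * Mj[j]))
    return I_hat

# Toy check against the closed-form MI of a bivariate Gaussian,
# I = -0.5 * log(1 - rho^2) ≈ 0.51 nats for rho = 0.8.
rng = np.random.default_rng(0)
rho = 0.8
Z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=20000)
print(base_edge_estimate(Z[:, :1], Z[:, 1:], eps=0.25))
```

A single base estimate of this kind has a bias-variance trade-off governed by the hash parameter; per the abstract, it is the ensemble over several hash parameters with LP-optimized weights that lets EDGE reach the parametric O(1/N) MSE rate while keeping O(N) computation.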

