On the Properties of Kullback-Leibler Divergence Between Gaussians

02/10/2021
by   Yufeng Zhang, et al.

Kullback-Leibler (KL) divergence is one of the most important divergence measures between probability distributions. In this paper, we investigate the properties of KL divergence between Gaussians. First, for any two n-dimensional Gaussians 𝒩_1 and 𝒩_2, we find the supremum of KL(𝒩_1||𝒩_2) when KL(𝒩_2||𝒩_1)≤ϵ for ϵ>0. This reveals the approximate symmetry of small KL divergence between Gaussians. We also find the infimum of KL(𝒩_1||𝒩_2) when KL(𝒩_2||𝒩_1)≥ M for M>0. Second, for any three n-dimensional Gaussians 𝒩_1, 𝒩_2, and 𝒩_3, we find a bound on KL(𝒩_1||𝒩_3) when KL(𝒩_1||𝒩_2) and KL(𝒩_2||𝒩_3) are bounded. This shows that the KL divergence between Gaussians satisfies a relaxed triangle inequality. Importantly, all the bounds in the theorems presented in this paper are independent of the dimension n.
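The approximate symmetry described above can be checked numerically using the standard closed form for the KL divergence between two multivariate Gaussians. The sketch below is illustrative only (it is not taken from the paper); the example distributions and tolerances are chosen arbitrarily to show that when one direction of the divergence is small, the reverse direction is also small.

```python
import numpy as np

def kl_gaussian(mu1, S1, mu2, S2):
    """Closed-form KL(N1 || N2) between multivariate Gaussians
    N1 = N(mu1, S1) and N2 = N(mu2, S2)."""
    n = len(mu1)
    S2_inv = np.linalg.inv(S2)
    diff = mu2 - mu1
    return 0.5 * (np.trace(S2_inv @ S1)          # trace term
                  + diff @ S2_inv @ diff         # Mahalanobis term
                  - n                            # dimension offset
                  + np.log(np.linalg.det(S2) / np.linalg.det(S1)))

# Two nearby 3-dimensional Gaussians (hypothetical example values):
mu1, S1 = np.zeros(3), np.eye(3)
mu2, S2 = 0.1 * np.ones(3), 1.05 * np.eye(3)

kl12 = kl_gaussian(mu1, S1, mu2, S2)
kl21 = kl_gaussian(mu2, S2, mu1, S1)
# Both directions are small and close to each other, illustrating the
# approximate symmetry of small KL divergence between Gaussians.
```

Note that KL divergence is asymmetric in general; the paper's contribution is to quantify, with dimension-free bounds, how far the two directions can drift apart once one of them is constrained.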

