Rate of convergence of the smoothed empirical Wasserstein distance

05/04/2022 · by Adam Block, et al.
Consider an empirical measure ā„™_n induced by n iid samples from a d-dimensional K-subgaussian distribution ā„™, and let Ī³ = š’©(0, Ļƒ^2 I_d) be the isotropic Gaussian measure. We study the speed of convergence of the smoothed Wasserstein distance W_2(ā„™_n * Ī³, ā„™ * Ī³) = n^{-Ī± + o(1)}, where * denotes convolution of measures. For K < Ļƒ and in any dimension d ≄ 1 we show that Ī± = 1/2. For K > Ļƒ in dimension d = 1 we show that the rate is slower and is given by Ī± = (Ļƒ^2 + K^2)^2 / (4(Ļƒ^4 + K^4)) < 1/2. This resolves several open problems in <cit.>, and in particular precisely identifies the amount of smoothing Ļƒ needed to obtain a parametric rate. In addition, we also establish that D_KL(ā„™_n * Ī³ ā€– ā„™ * Ī³) has rate O(1/n) for K < Ļƒ, but slows down to O((log n)^{d+1}/n) for K > Ļƒ. The surprising difference in the behavior of W_2^2 and KL implies the failure of the T_2-transportation inequality when Ļƒ < K. Consequently, the requirement K < Ļƒ is necessary for validity of the log-Sobolev inequality (LSI) for the Gaussian mixture ā„™ * š’©(0, Ļƒ^2 I_d), closing an open problem in <cit.>, who established the LSI under precisely this condition.
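As a quick numerical illustration (a sketch, not from the paper), the d = 1 behavior can be probed by Monte Carlo: in one dimension, the W_2 distance between two equal-size empirical measures equals the root-mean-square gap between their sorted samples. The sketch below takes ā„™ = š’©(0, K^2) so that ā„™ * Ī³ = š’©(0, K^2 + Ļƒ^2) is known in closed form; the parameter choices (K, Ļƒ, sample sizes, the helper name `smoothed_w2_1d`) are all illustrative assumptions.

```python
import numpy as np

def smoothed_w2_1d(n, K=1.0, sigma=1.5, m=100_000, seed=0):
    """Monte Carlo estimate of W_2(P_n * gamma, P * gamma) in d = 1.

    Illustrative sketch: P = N(0, K^2), gamma = N(0, sigma^2),
    hence the smoothed target is P * gamma = N(0, K^2 + sigma^2).
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, K, size=n)  # n iid samples defining P_n
    # m samples from P_n * gamma: pick a data point uniformly, add Gaussian noise
    a = x[rng.integers(0, n, size=m)] + rng.normal(0.0, sigma, size=m)
    # m samples from the smoothed population measure P * gamma
    b = rng.normal(0.0, np.sqrt(K**2 + sigma**2), size=m)
    # In 1d, W_2 between equal-size empirical measures is the RMS gap
    # between the sorted samples (optimal coupling is monotone)
    return float(np.sqrt(np.mean((np.sort(a) - np.sort(b)) ** 2)))

# Since K < sigma here, the abstract's alpha = 1/2 regime applies:
# the estimate should shrink roughly like n^{-1/2} as n grows
# (up to the Monte Carlo error from the finite m).
print(smoothed_w2_1d(50), smoothed_w2_1d(5000))
```

With K < Ļƒ as above, increasing n by a factor of 100 should shrink the distance by roughly a factor of 10, matching the parametric rate Ī± = 1/2; the finite-m discretization adds a small noise floor to both estimates.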
