Rate of convergence of the smoothed empirical Wasserstein distance
Consider an empirical measure ℙ_n induced by n iid samples from a d-dimensional K-subgaussian distribution ℙ, and let γ = 𝒩(0, σ^2 I_d) be the isotropic Gaussian measure. We study the speed of convergence of the smoothed Wasserstein distance W_2(ℙ_n * γ, ℙ * γ) = n^{-α + o(1)}, where * denotes convolution of measures. For K < σ and in any dimension d ≥ 1 we show that α = 1/2. For K > σ in dimension d = 1 we show that the rate is slower and is given by α = (σ^2 + K^2)^2 / (4(σ^4 + K^4)) < 1/2. This resolves several open problems in <cit.>, and in particular precisely identifies the amount of smoothing σ needed to obtain a parametric rate. In addition, we establish that D_KL(ℙ_n * γ ‖ ℙ * γ) has rate O(1/n) for K < σ but slows down to O((log n)^{d+1}/n) for K > σ. The surprising difference between the behavior of W_2^2 and KL implies the failure of the T_2-transportation inequality when σ < K. Consequently, the requirement K < σ is necessary for the validity of the log-Sobolev inequality (LSI) for the Gaussian mixture ℙ * 𝒩(0, σ^2), closing an open problem in <cit.>, who established the LSI under precisely this condition.
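A short worked check (not part of the abstract, using only the exponent formula stated above): at the boundary K = σ the slower exponent reduces to the parametric one, and it falls strictly below 1/2 whenever K > σ. In LaTeX,

\[
  \alpha\Big|_{K=\sigma}
    = \frac{(\sigma^2+\sigma^2)^2}{4(\sigma^4+\sigma^4)}
    = \frac{4\sigma^4}{8\sigma^4}
    = \frac{1}{2},
  \qquad
  \frac{(\sigma^2+K^2)^2}{4(\sigma^4+K^4)} < \frac{1}{2}
  \;\iff\; (\sigma^2+K^2)^2 < 2(\sigma^4+K^4)
  \;\iff\; (K^2-\sigma^2)^2 > 0,
\]

so the two regimes agree at K = σ, and for K > σ the exponent is strictly sub-parametric, consistent with the claim above.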