A Fourier Approach to Mixture Learning

10/05/2022
by Mingda Qiao et al.

We revisit the problem of learning mixtures of spherical Gaussians. Given samples from the mixture (1/k) ∑_{j=1}^k 𝒩(μ_j, I_d), the goal is to estimate the means μ_1, μ_2, …, μ_k ∈ ℝ^d up to a small error. The hardness of this learning problem can be measured by the separation Δ, defined as the minimum distance between any pair of means. Regev and Vijayaraghavan (2017) showed that with separation Δ = Ω(√(log k)), the means can be learned using poly(k, d) samples, whereas super-polynomially many samples are required if Δ = o(√(log k)) and d = Ω(log k). This leaves open the low-dimensional regime where d = o(log k). In this work, we give an algorithm that efficiently learns the means in d = O(log k / log log k) dimensions under separation d/√(log k) (modulo doubly logarithmic factors). This separation is strictly smaller than √(log k), and we also show it to be necessary. Together with the results of Regev and Vijayaraghavan (2017), our work almost pins down the critical separation threshold at which efficient parameter learning becomes possible for spherical Gaussian mixtures. More generally, our algorithm runs in time poly(k) · f(d, Δ, ϵ), and is thus fixed-parameter tractable in the parameters d, Δ, and ϵ. Our approach is based on estimating the Fourier transform of the mixture at carefully chosen frequencies, and both the algorithm and its analysis are simple and elementary. Our positive results extend readily to learning mixtures of non-Gaussian distributions, under a mild condition on the Fourier spectrum of the distribution.
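To make the Fourier idea concrete, here is a minimal one-dimensional sketch, not the paper's actual algorithm: it estimates the mixture's Fourier transform via the empirical characteristic function φ̂(ξ) = (1/n) ∑_m e^{iξx_m} on a grid of frequencies, divides out the Gaussian factor e^{-ξ²/2} so that what remains is approximately (1/k) ∑_j e^{iξμ_j}, and reads the means off as peaks of a matched-filter scan. The specific values (`true_means`, the frequency band, the grid sizes) are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-d mixture of k unit-variance Gaussians (means are made up).
true_means = np.array([-4.0, 0.0, 5.0])
k, n = len(true_means), 100_000
samples = rng.normal(loc=true_means[rng.integers(k, size=n)], scale=1.0)

# Empirical Fourier transform (characteristic function) at chosen
# frequencies xi: phi_hat(xi) = (1/n) * sum_m exp(i * xi * x_m).
xis = np.linspace(-2.0, 2.0, 401)
phi_hat = np.array([np.exp(1j * xi * samples).mean() for xi in xis])

# For the mixture (1/k) sum_j N(mu_j, 1), the population transform is
# exp(-xi^2 / 2) * (1/k) * sum_j exp(i * xi * mu_j); dividing out the
# Gaussian factor leaves a sum of complex exponentials whose
# frequencies are exactly the means mu_j.
spectrum = phi_hat * np.exp(xis**2 / 2)

# Matched-filter scan: correlate the deconvolved spectrum against
# exp(-i * xi * t) over candidate locations t; peaks sit near the means.
grid = np.linspace(-8.0, 8.0, 1601)
scores = np.abs(np.exp(-1j * np.outer(grid, xis)) @ spectrum)

# Keep the k tallest local maxima as the estimated means.
is_peak = (scores[1:-1] > scores[:-2]) & (scores[1:-1] > scores[2:])
locs, heights = grid[1:-1][is_peak], scores[1:-1][is_peak]
est_means = locs[np.argsort(heights)[-k:]]
print(np.sort(est_means))  # approximately [-4, 0, 5]
```

In higher dimensions the crux, as the abstract notes, is choosing the frequencies carefully; this sketch simply scans a fixed one-dimensional band, which suffices only when the means are well separated relative to the bandwidth.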
