Reconstruction under outliers for Fourier-sparse functions

07/09/2019
by Xue Chen et al.

We consider the problem of learning an unknown function f with a sparse Fourier spectrum in the presence of outlier noise. In particular, the algorithm has access to a noisy oracle for (an unknown) f such that (i) the Fourier spectrum of f is k-sparse; (ii) at any query point x, the oracle returns y such that with probability 1 - ρ, |y - f(x)| < ϵ. However, with probability ρ, the error y - f(x) can be arbitrarily large. We study Fourier-sparse functions over both the discrete cube {0,1}^n and the torus [0,1), and for both these domains, we design efficient algorithms which can tolerate any ρ < 1/2 fraction of outliers. We note that the analogous problem for low-degree polynomials has recently been studied in several works [AK03, GZ16, KKP17], and similar algorithmic guarantees are known in that setting. While our main results pertain to the case where the locations of the outliers, i.e., the points x such that |y - f(x)| > ϵ, are randomly distributed, we also study the case where the outliers are adversarially located. In particular, we show that over the torus, assuming that the Fourier transform satisfies a certain granularity condition, there is a sample-efficient algorithm to tolerate a ρ = Ω(1) fraction of outliers, and further, that this is not possible without such a granularity condition. Finally, while not the principal thrust, our techniques also allow us to non-trivially improve on learning low-degree functions f on the hypercube in the presence of adversarial outlier noise. Our techniques combine a diverse array of tools from compressive sensing, sparse Fourier transform, chaining arguments, and complex analysis.
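To make the observation model concrete, here is a minimal sketch (not the paper's algorithm) of the noisy oracle over the discrete cube {0,1}^n: f is k-sparse in the Fourier (parity) basis, and each query returns f(x) plus a small bounded error with probability 1 - ρ, or an arbitrarily large outlier with probability ρ. The specific dimensions, coefficient distribution, and outlier magnitude below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n, k = 10, 3          # hypothetical: n variables, k nonzero Fourier coefficients
eps, rho = 0.01, 0.3  # bounded-noise level eps and outlier probability rho < 1/2

# Pick k random Fourier characters (subsets S of [n]) and coefficients.
supports = rng.integers(0, 2, size=(k, n))   # rows are indicator vectors of subsets S
coeffs = rng.standard_normal(k)

def f(x):
    """k-sparse Fourier function f(x) = sum_S hat{f}(S) * (-1)^{<S, x>} over {0,1}^n."""
    return float(np.sum(coeffs * (-1.0) ** (supports @ x)))

def noisy_oracle(x):
    """Return y with |y - f(x)| < eps w.p. 1 - rho; otherwise an arbitrary outlier."""
    if rng.random() < rho:
        return f(x) + rng.uniform(-100, 100)   # outlier: error can be arbitrarily large
    return f(x) + rng.uniform(-eps, eps)       # inlier: small bounded error

# Example query at a random point of the hypercube.
x = rng.integers(0, 2, size=n)
print(f(x), noisy_oracle(x))
```

The learner only sees queries x of its choosing and the corresponding oracle outputs y; the reconstruction task is to recover the k-sparse Fourier representation of f despite the ρ fraction of corrupted responses.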
