Efficient active learning of sparse halfspaces

05/07/2018
by Chicheng Zhang, et al.

We study the problem of efficient PAC active learning of homogeneous linear classifiers (halfspaces) in R^d, where the goal is to learn a halfspace with low error using as few label queries as possible. Given the additional assumption that there is a t-sparse halfspace that performs well on the data (t ≪ d), we would like our active learning algorithm to be attribute efficient, i.e., to have label requirements sublinear in d. In this paper, we provide a computationally efficient algorithm that achieves this goal. Under certain distributional assumptions on the data, our algorithm achieves a label complexity of O(t polylog(d, 1/ϵ)). In contrast, existing algorithms in this setting are either computationally inefficient or subject to label requirements polynomial in d or 1/ϵ.
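To make the problem setting concrete, the sketch below shows a generic margin-based active learning loop combined with hard thresholding to keep the learned weight vector t-sparse. This is not the paper's algorithm; the update rule, sampling schedule, and all function names here are illustrative assumptions meant only to convey what attribute-efficient active learning of a sparse halfspace looks like.

```python
# Illustrative sketch only: a margin-based active learner with hard thresholding.
# This is NOT the algorithm from the paper; the update rule and schedule are
# assumptions chosen to illustrate the problem setting in the abstract.

import numpy as np


def hard_threshold(w, t):
    """Keep the t largest-magnitude coordinates of w; zero out the rest."""
    if np.count_nonzero(w) <= t:
        return w
    w = w.copy()
    w[np.argsort(np.abs(w))[:-t]] = 0.0
    return w


def active_learn_sparse_halfspace(X, query_label, t, rounds=10, batch=50, lr=0.1):
    """Margin-based active learning loop (illustrative only).

    X           : (n, d) pool of unlabeled points.
    query_label : callable mapping a row index to its +/-1 label (the oracle).
    t           : target sparsity of the learned halfspace.
    """
    n, d = X.shape
    w = np.zeros(d)
    labels_used = 0
    for _ in range(rounds):
        # Query labels for the points closest to the current decision boundary.
        margins = np.abs(X @ w)
        for i in np.argsort(margins)[:batch]:
            y = query_label(i)
            labels_used += 1
            # Perceptron-style update on mistakes, then re-sparsify.
            if y * (X[i] @ w) <= 0:
                w = hard_threshold(w + lr * y * X[i], t)
    return w, labels_used


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, t, n = 200, 5, 5000
    w_star = np.zeros(d)
    w_star[:t] = 1.0 / np.sqrt(t)  # hidden t-sparse halfspace
    X = rng.standard_normal((n, d))
    oracle = lambda i: 1 if X[i] @ w_star >= 0 else -1
    w_hat, used = active_learn_sparse_halfspace(X, oracle, t)
    err = np.mean(np.sign(X @ w_hat) != np.sign(X @ w_star))
    print(f"labels used: {used}, empirical error: {err:.3f}")
```

The point of the sketch is the contrast the abstract draws: the learner only asks the oracle for labels of points it selects (active queries), and the sparsity constraint is what makes label requirements sublinear in d plausible.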
