Improved Algorithms for Efficient Active Learning Halfspaces with Massart and Tsybakov noise

02/10/2021
by   Chicheng Zhang, et al.

We develop a computationally efficient PAC active learning algorithm for d-dimensional homogeneous halfspaces that can tolerate Massart noise <cit.> and Tsybakov noise <cit.>. Specialized to the η-Massart noise setting, our algorithm achieves an information-theoretically optimal label complexity of Õ( d/(1-2η)^2 · polylog(1/ϵ) ) under a wide range of unlabeled data distributions (specifically, the family of "structured distributions" defined in <cit.>). Under the more challenging Tsybakov noise condition, we identify two subfamilies of noise conditions under which our algorithm achieves computational efficiency and provides label complexity guarantees strictly lower than those of passive learning algorithms.
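For context, a minimal LaTeX sketch of the standard η-Massart noise condition referenced above, together with the label complexity bound stated in the abstract; the symbols w^* (the target homogeneous halfspace) and η(x) (the pointwise label-flip probability) follow the usual formulation and are not defined in this abstract itself.

% Standard eta-Massart noise condition (sketch): each label y disagrees with
% the target halfspace sign(<w^*, x>) with probability at most eta < 1/2.
\[
  \Pr\bigl[\, y \neq \operatorname{sign}(\langle w^\ast, x \rangle) \,\bigm|\, x \,\bigr]
  \;=\; \eta(x) \;\le\; \eta \;<\; \tfrac{1}{2}.
\]
% Under this condition, the abstract states a label complexity of
\[
  \tilde{O}\!\left( \frac{d}{(1-2\eta)^{2}} \,\mathrm{polylog}(1/\epsilon) \right).
\]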
