Almost Tight Approximation Algorithms for Explainable Clustering

07/01/2021
by Hossein Esfandiari, et al.

Recently, due to an increasing interest in transparency in artificial intelligence, several methods of explainable machine learning have been developed with the simultaneous goals of accuracy and human interpretability. In this paper, we study a recent framework of explainable clustering first suggested by Dasgupta et al. <cit.>. Specifically, we focus on the k-means and k-medians problems and provide nearly tight upper and lower bounds. First, we provide an O(log k loglog k)-approximation algorithm for explainable k-medians, improving on the best known algorithm of O(k) <cit.> and nearly matching the known Ω(log k) lower bound <cit.>. In addition, in low-dimensional spaces d ≪ log k, we show that our algorithm also provides an O(d log^2 d)-approximate solution for explainable k-medians. This improves over the best known bound of O(d log k) for low dimensions <cit.>, and is a constant for constant-dimensional spaces. To complement this, we show a nearly matching Ω(d) lower bound. Next, we study the k-means problem in this context and provide an O(k log k)-approximation algorithm for explainable k-means, improving over the O(k^2) bound of Dasgupta et al. and the O(d k log k) bound of <cit.>. To complement this, we provide an almost tight Ω(k) lower bound, improving over the Ω(log k) lower bound of Dasgupta et al. Given an approximate solution to the classic k-means and k-medians problems, our algorithm for k-medians runs in time O(kd log^2 k) and our algorithm for k-means runs in time O(k^2 d).
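For concreteness, below is a minimal sketch of the explainable-clustering framework of Dasgupta et al. that the abstract refers to: starting from reference centers produced by a classic k-medians solver, a binary threshold tree with k leaves is built from axis-aligned cuts, each point is routed to a leaf, and the leaf's single center defines its cluster. The greedy "widest coordinate, widest gap" splitting rule used here is only an illustrative assumption for the sketch; it is not the randomized algorithm analyzed in the paper.

```python
# Sketch of the threshold-tree framework for explainable k-medians.
# Assumptions: distinct reference centers from a classic k-medians solution;
# the splitting rule below is illustrative, not the paper's algorithm.
import numpy as np


class Node:
    def __init__(self, centers_idx, feature=None, threshold=None,
                 left=None, right=None):
        self.centers_idx = centers_idx  # reference centers reaching this node
        self.feature = feature          # coordinate used for the cut (internal nodes)
        self.threshold = threshold      # cut value (internal nodes)
        self.left = left                # subtree with coordinate value <= threshold
        self.right = right              # subtree with coordinate value >  threshold


def build_threshold_tree(centers, idx=None):
    """Separate the reference centers with axis-aligned cuts until each leaf
    holds exactly one center, so the tree has k leaves."""
    if idx is None:
        idx = np.arange(len(centers))
    if len(idx) == 1:
        return Node(idx)
    sub = centers[idx]
    # Illustrative rule: cut along the coordinate where the centers are most
    # spread, in the middle of the widest gap between consecutive center values.
    feature = int(np.argmax(sub.max(axis=0) - sub.min(axis=0)))
    vals = np.sort(sub[:, feature])
    cut = int(np.argmax(np.diff(vals)))
    threshold = float((vals[cut] + vals[cut + 1]) / 2)
    left_mask = sub[:, feature] <= threshold
    left = build_threshold_tree(centers, idx[left_mask])
    right = build_threshold_tree(centers, idx[~left_mask])
    return Node(idx, feature, threshold, left, right)


def assign(tree, x):
    """Route a point down the tree; the leaf's single center is its cluster."""
    node = tree
    while node.feature is not None:
        node = node.left if x[node.feature] <= node.threshold else node.right
    return int(node.centers_idx[0])


def kmedians_cost(X, centers, tree):
    """Explainable k-medians cost: sum of L1 distances to assigned leaf centers."""
    labels = np.array([assign(tree, x) for x in X])
    return float(np.abs(X - centers[labels]).sum())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    centers = rng.normal(size=(5, 3)) * 10   # stand-in for a classic k-medians solution
    X = np.vstack([c + rng.normal(size=(50, 3)) for c in centers])
    tree = build_threshold_tree(centers)
    print("explainable k-medians cost:", kmedians_cost(X, centers, tree))
```

The quantity of interest in the paper is how much this tree-induced cost can exceed the cost of the unconstrained (classic) clustering; the bounds quoted in the abstract control exactly that ratio.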
