Approximation Guarantees of Stochastic Greedy Algorithms for Non-monotone Submodular Maximization with a Size Constraint
The stochastic greedy algorithm (SG) is a randomized version of the greedy algorithm for submodular maximization with a size constraint. SG is highly practical since it is fast, delivers high empirical performance, and is easy to implement. However, its approximation guarantee has been proved only for monotone objective functions; this is natural since the original greedy algorithm is known to perform arbitrarily poorly for non-monotone objectives in general. In this paper, contrary to this expectation, we prove an interesting result: thanks to the randomization, SG (with a slight modification) can achieve almost 1/4-approximation guarantees in expectation even for non-monotone objective functions. Our result provides practical and theoretically guaranteed algorithms for non-monotone submodular maximization with a size constraint, which run far faster than existing algorithms while achieving comparable objective values.
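To make the procedure concrete, below is a minimal sketch of the standard stochastic greedy algorithm (in the style of Mirzasoleiman et al.), assuming access to a generic set-function oracle `f`. The function name `stochastic_greedy` and the parameters `eps` and `sample_size` are illustrative choices; this sketch does not reproduce the paper's specific modification for non-monotone objectives.

```python
import math
import random

def stochastic_greedy(ground_set, f, k, eps=0.1):
    """Sketch of standard stochastic greedy under a size constraint k.

    f: set-function oracle, f(S) -> float (assumed non-negative submodular).
    eps: accuracy parameter controlling the per-step random sample size.
    """
    n = len(ground_set)
    # Per-iteration sample size (n/k) * ln(1/eps), capped at n.
    sample_size = min(n, int(math.ceil((n / k) * math.log(1.0 / eps))))

    selected = set()
    for _ in range(k):
        remaining = [e for e in ground_set if e not in selected]
        if not remaining:
            break
        # Evaluate marginal gains only on a random candidate subset,
        # instead of scanning the whole remaining ground set.
        candidates = random.sample(remaining, min(sample_size, len(remaining)))
        best = max(candidates, key=lambda e: f(selected | {e}) - f(selected))
        # Only add the element if its marginal gain is positive; the paper's
        # modified variant for non-monotone objectives is not shown here.
        if f(selected | {best}) - f(selected) > 0:
            selected.add(best)
    return selected
```

As a usage illustration, `f` could be a simple coverage function such as `f = lambda S: len(set().union(*(covers[e] for e in S)) if S else set())` for some hypothetical dictionary `covers`; the speed advantage over plain greedy comes from evaluating marginal gains on a random sample per step rather than the full ground set.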