Differentiable Greedy Submodular Maximization: Guarantees, Gradient Estimators, and Applications

05/06/2020
by Shinsaku Sakaue, et al.

We consider making the outputs of the greedy algorithm for monotone submodular function maximization differentiable with respect to the parameters of the objective function; this is motivated by many applications, e.g., sensitivity analysis and end-to-end learning. Our contribution is a theoretically guaranteed and widely applicable smoothing framework based on randomization. We prove that our smoothed greedy algorithm almost recovers the original approximation guarantees in expectation under cardinality and κ-extensible system constraints. We also show how to efficiently compute unbiased gradient estimators of any expected output-dependent quantity by sampling outputs. We demonstrate the utility and effectiveness of our framework by applying it to various situations, including the aforementioned ones.
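To make the idea concrete, below is a minimal sketch of one way such a framework can look: the deterministic argmax in each greedy step is replaced by sampling an element from a softmax over marginal gains, and the gradient of the expected output value is estimated without bias by combining a score-function (REINFORCE-style) term with the direct pathwise term. The softmax smoothing, the temperature `tau`, and the weighted-coverage objective used here are illustrative assumptions for this sketch, not the paper's exact construction.

```python
# Illustrative sketch (not the paper's exact procedure): smoothed greedy
# under a cardinality constraint for a weighted-coverage objective
# f(S; theta) = sum of theta[u] over universe items covered by S,
# which is monotone submodular for theta >= 0.
import numpy as np

rng = np.random.default_rng(0)

# Each ground-set element covers a fixed subset of universe items (assumed data).
COVERS = [{0, 1}, {1, 2}, {2, 3}, {3, 4}, {0, 4}]
N_ITEMS = 5

def gain_vector(covered, theta):
    """Marginal gain of every element given the already-covered items."""
    return np.array([sum(theta[u] for u in cov - covered) for cov in COVERS])

def smoothed_greedy(theta, k, tau=0.5):
    """Sample a size-k set: each step picks an element with probability
    softmax(marginal gains / tau) instead of the deterministic argmax.
    Returns the chosen set, its value, and grad_theta log p(S; theta)."""
    covered, chosen = set(), []
    grad_logp = np.zeros_like(theta)
    for _ in range(k):
        gains = gain_vector(covered, theta)
        # d gains[e] / d theta[u] = 1 iff element e covers the still-uncovered item u
        dgains = np.array([[1.0 if (u in cov and u not in covered) else 0.0
                            for u in range(N_ITEMS)] for cov in COVERS])
        logits = gains.astype(float) / tau
        logits[chosen] = -np.inf          # never re-pick a chosen element
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        e = int(rng.choice(len(COVERS), p=probs))
        # grad of log softmax prob: (dgains[e] - sum_j probs[j] * dgains[j]) / tau
        grad_logp += (dgains[e] - probs @ dgains) / tau
        chosen.append(e)
        covered |= COVERS[e]
    value = float(sum(theta[u] for u in covered))
    return chosen, value, grad_logp

# Unbiased estimate of grad_theta E[f(S; theta)] by sampling outputs:
# score-function term E[f * grad log p] plus pathwise term E[grad_theta f].
theta = np.ones(N_ITEMS)
grads = []
for _ in range(2000):
    S, val, glp = smoothed_greedy(theta, k=2)
    covered = set().union(*(COVERS[e] for e in S))
    df = np.array([1.0 if u in covered else 0.0 for u in range(N_ITEMS)])
    grads.append(val * glp + df)
print("estimated grad of E[f(S; theta)]:", np.round(np.mean(grads, axis=0), 3))
```

Because the sampled set's probability is differentiable in theta, the same sampling scheme yields an unbiased gradient estimator for any expected output-dependent quantity, not just the objective value used here.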
