Quadratic Decomposable Submodular Function Minimization
We introduce a new convex optimization problem, termed quadratic decomposable submodular function minimization. The problem arises in many learning settings on graphs and hypergraphs and is closely related to decomposable submodular function minimization. We approach the problem via a new dual strategy and describe an objective that may be optimized via random coordinate descent (RCD) methods and projections onto cones. We also establish the linear convergence rate of the RCD algorithm and develop efficient projection algorithms with provable performance guarantees. Numerical experiments in transductive learning on hypergraphs confirm the efficiency of the proposed algorithm and demonstrate significant improvements in prediction accuracy over state-of-the-art methods.
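To illustrate the random coordinate descent template mentioned above, the sketch below applies RCD with exact coordinate updates to a generic strongly convex quadratic. It is only an assumed, simplified illustration: the matrix A, vector b, and the function rcd_quadratic are hypothetical and do not reproduce the paper's dual objective or its projections onto cones.

```python
# Minimal, generic RCD sketch for f(x) = 0.5 * x^T A x - b^T x (assumed example;
# not the paper's dual formulation or cone-projection steps).
import numpy as np

def rcd_quadratic(A, b, num_iters=5000, seed=0):
    """Minimize 0.5*x^T A x - b^T x via exact updates on randomly chosen coordinates."""
    rng = np.random.default_rng(seed)
    n = b.shape[0]
    x = np.zeros(n)
    for _ in range(num_iters):
        i = rng.integers(n)           # pick a coordinate uniformly at random
        grad_i = A[i] @ x - b[i]      # partial derivative with respect to x_i
        x[i] -= grad_i / A[i, i]      # exact minimization along coordinate i
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    M = rng.standard_normal((10, 10))
    A = M.T @ M + np.eye(10)          # symmetric positive definite
    b = rng.standard_normal(10)
    x_rcd = rcd_quadratic(A, b)
    print(np.linalg.norm(x_rcd - np.linalg.solve(A, b)))  # residual should be small
```

For strongly convex quadratics, this kind of exact per-coordinate update yields the linear convergence behavior that RCD analyses typically establish; the paper's guarantees concern its own dual objective rather than this toy instance.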