DIPPA: An Improved Method for Bilinear Saddle Point Problems
This paper studies bilinear saddle point problems of the form $\min_x \max_y \; g(x) + x^\top A y - h(y)$, where the functions $g$ and $h$ are smooth and strongly convex. When gradient and proximal oracles for $g$ and $h$ are available, optimal algorithms have already been developed in the literature <cit.>. However, the proximal operator is not always easy to compute, especially in constrained zero-sum matrix games <cit.>. This work proposes a new algorithm that only requires access to the gradients of $g$ and $h$. Our algorithm achieves a complexity upper bound of $\tilde{\mathcal{O}}\big( \|A\|_2/\sqrt{\mu_x \mu_y} + \sqrt{\kappa_x \kappa_y (\kappa_x + \kappa_y)} \big)$, which has an optimal dependence on the coupling condition number $\|A\|_2/\sqrt{\mu_x \mu_y}$ up to logarithmic factors.
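For concreteness, the following is a brief LaTeX restatement of the setting; the parameters $\mu_x, \mu_y, L_x, L_y$ and the condition numbers $\kappa_x, \kappa_y$ are written under the standard definitions, which are assumed here rather than quoted from the paper.

% Sketch of the problem setting (standard definitions assumed, not quoted from the paper).
\begin{equation*}
  \min_{x} \max_{y} \; F(x, y) := g(x) + x^{\top} A y - h(y),
\end{equation*}
where $g$ is $\mu_x$-strongly convex and $L_x$-smooth, $h$ is $\mu_y$-strongly convex and
$L_y$-smooth, and the condition numbers are
\begin{equation*}
  \kappa_x := \frac{L_x}{\mu_x}, \qquad
  \kappa_y := \frac{L_y}{\mu_y},
\end{equation*}
so the reported bound reads
$\tilde{\mathcal{O}}\!\left( \frac{\|A\|_2}{\sqrt{\mu_x \mu_y}} + \sqrt{\kappa_x \kappa_y (\kappa_x + \kappa_y)} \right)$
gradient evaluations, with the first term governed by the coupling matrix $A$.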