Curvature-Dependent Global Convergence Rates for Optimization on Manifolds of Bounded Geometry

08/06/2020
by   Mario Lezcano Casado, et al.

We give curvature-dependent convergence rates for the optimization of weakly convex functions defined on a manifold of 1-bounded geometry via Riemannian gradient descent and via the dynamic trivialization algorithm. To do this, we give a tighter bound on the norm of the Hessian of the Riemannian exponential map than was previously known. We compute these bounds explicitly for some manifolds commonly used in the optimization literature, such as the special orthogonal group and the real Grassmannian. Along the way, we present self-contained proofs of fully general bounds on the norm of the differential of the exponential map and of certain cosine inequalities on manifolds, which are commonly used in optimization on manifolds.
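
To fix ideas, here is a minimal sketch of Riemannian gradient descent via the exponential map on the special orthogonal group SO(n), one of the manifolds for which the paper computes its bounds explicitly. This is not the paper's code: the Python/NumPy setting, the toy objective, the step size eta, and the helper names (skew, riemannian_gd_SO) are illustrative assumptions.

    # Illustrative sketch (not the paper's implementation) of Riemannian
    # gradient descent on SO(n) using the exponential map as the retraction.
    import numpy as np
    from scipy.linalg import expm

    def skew(M):
        """Skew-symmetric part of a square matrix."""
        return 0.5 * (M - M.T)

    def riemannian_gd_SO(f_grad, X0, eta=0.1, steps=100):
        """Riemannian gradient descent on SO(n).

        f_grad(X) returns the Euclidean gradient of the objective at X.
        The Riemannian gradient is the projection of the Euclidean gradient
        onto the tangent space T_X SO(n) = {X A : A skew-symmetric}, and the
        update follows the geodesic X_{k+1} = X_k expm(-eta * A_k).
        """
        X = X0.copy()
        for _ in range(steps):
            G = f_grad(X)
            A = skew(X.T @ G)        # Riemannian gradient, in the Lie algebra
            X = X @ expm(-eta * A)   # geodesic step via the matrix exponential
        return X

    if __name__ == "__main__":
        # Toy problem (an assumption, for illustration): find the rotation
        # closest to a given matrix B, i.e. minimize f(X) = -trace(X.T @ B).
        rng = np.random.default_rng(0)
        B = rng.standard_normal((3, 3))
        f_grad = lambda X: -B        # Euclidean gradient of f
        X = riemannian_gd_SO(f_grad, np.eye(3), eta=0.2, steps=200)
        print("orthogonality error:", np.linalg.norm(X.T @ X - np.eye(3)))

Because each step moves along a geodesic through the matrix exponential of a skew-symmetric matrix, the iterates remain exactly on SO(n) (up to floating-point error); the curvature-dependent rates in the paper bound how fast such iterations converge for weakly convex objectives.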
