Riemannian optimization using three different metrics for Hermitian PSD fixed-rank constraints: an extended version
We consider smooth optimization problems with a Hermitian positive semi-definite fixed-rank constraint, where a quotient geometry with three Riemannian metrics g^i(·, ·) (i = 1, 2, 3) is used to represent this constraint. Taking the nonlinear conjugate gradient method (CG) as an example, we show that CG on the quotient geometry with metric g^1 is equivalent to CG in the factor-based optimization framework, often called the Burer–Monteiro approach. We also show that CG on the quotient geometry with metric g^3 is equivalent to CG on the commonly used embedded geometry. We call two CG methods equivalent if they produce an identical sequence of iterates {X_k}. In addition, we show that if the limit point of the sequence {X_k} generated by an algorithm has lower rank, that is, each X_k ∈ ℂ^{n×n} (k = 1, 2, …) has rank p while the limit point X_* has rank r < p, then the condition number of the Riemannian Hessian with metric g^1 can be unbounded, whereas those of the other two metrics stay bounded. Numerical experiments show that the Burer–Monteiro CG method has a slower local convergence rate than CG on the quotient geometry under the other two metrics when the limit point has reduced rank. This slower convergence rate can thus be attributed to the large condition number of the Hessian near a minimizer.
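As a point of reference for the factor-based (Burer–Monteiro) framework mentioned above, the following is a minimal sketch, not the paper's algorithm: it parameterizes X = Y Y^H with Y ∈ ℂ^{n×p} and runs plain gradient descent on Y for an illustrative toy objective f(X) = ½‖X − A‖_F². The problem sizes, the target A, the initialization, and the step size are all assumptions made for the example.

```python
import numpy as np

# Illustrative sketch of the Burer–Monteiro idea for a Hermitian PSD
# fixed-rank constraint: parameterize X = Y Y^H with Y in C^{n x p},
# so a smooth objective f(X) becomes an unconstrained problem in Y.
# Toy objective: f(X) = 0.5 * ||X - A||_F^2 for a Hermitian target A.

rng = np.random.default_rng(0)
n, p = 20, 3

# Random Hermitian PSD target A of rank p (assumed for this example).
B = rng.standard_normal((n, p)) + 1j * rng.standard_normal((n, p))
A = B @ B.conj().T

# Initial factor Y, the variable of the factor-based formulation.
Y = rng.standard_normal((n, p)) + 1j * rng.standard_normal((n, p))

step = 1e-3
for _ in range(2000):
    X = Y @ Y.conj().T
    # Gradient of f w.r.t. X is (X - A); the chain rule gives the
    # (Euclidean) descent direction w.r.t. Y as 2 * (X - A) @ Y.
    grad_Y = 2.0 * (X - A) @ Y
    Y = Y - step * grad_Y

print("final objective:", 0.5 * np.linalg.norm(Y @ Y.conj().T - A) ** 2)
```

The quotient-geometry methods studied in the paper differ from this sketch in the choice of Riemannian metric, retraction, and (for CG) vector transport, but the underlying search space, rank-p Hermitian PSD matrices represented through their factors, is the same.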