O(k)-Equivariant Dimensionality Reduction on Stiefel Manifolds

09/19/2023
by Andrew Lee et al.

Many real-world datasets live on high-dimensional Stiefel and Grassmannian manifolds, V_k(ℝ^N) and Gr(k, ℝ^N) respectively, and benefit from projection onto lower-dimensional Stiefel (respectively, Grassmannian) manifolds. In this work, we propose an algorithm called Principal Stiefel Coordinates (PSC) to reduce data dimensionality from V_k(ℝ^N) to V_k(ℝ^n) in an O(k)-equivariant manner (k ≤ n ≪ N). We begin by observing that each element α ∈ V_n(ℝ^N) defines an isometric embedding of V_k(ℝ^n) into V_k(ℝ^N). Next, we optimize for such an embedding map that minimizes data fit error by warm-starting with the output of principal component analysis (PCA) and applying gradient descent. Then, we define a continuous and O(k)-equivariant map π_α that acts as a “closest point operator” to project the data onto the image of V_k(ℝ^n) in V_k(ℝ^N) under the embedding determined by α, while minimizing distortion. Because this dimensionality reduction is O(k)-equivariant, these results extend to Grassmannian manifolds as well. Lastly, we show that the PCA output globally minimizes projection error in a noiseless setting, but that our algorithm achieves a meaningfully different and improved outcome when the data does not lie exactly on the image of a linearly embedded lower-dimensional Stiefel manifold as above. We conclude with multiple numerical experiments on synthetic and real-world data.
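The two maps described above can be sketched numerically. The following is a minimal illustration, not the authors' implementation: the embedding sends Y ∈ V_k(ℝ^n) to αY ∈ V_k(ℝ^N), and a natural "closest point"-style projection takes X ∈ V_k(ℝ^N) to the polar factor of αᵀX (computed via SVD), which is O(k)-equivariant since the polar factor of (αᵀX)Q equals (polar factor of αᵀX)·Q for Q ∈ O(k). The function names and the use of the polar factor here are illustrative assumptions; the paper's π_α may be defined differently.

```python
import numpy as np

def random_stiefel(N, k, rng):
    # A uniformly random N x k orthonormal frame via QR of a Gaussian matrix.
    A = rng.standard_normal((N, k))
    Q, R = np.linalg.qr(A)
    return Q * np.sign(np.diag(R))  # sign fix makes the factorization unique

def embed(alpha, Y):
    # alpha in V_n(R^N) embeds Y in V_k(R^n) isometrically as alpha @ Y in V_k(R^N).
    return alpha @ Y

def project(alpha, X):
    # Illustrative closest-point-style projection (an assumption, not the
    # paper's exact pi_alpha): the polar factor of alpha^T X, i.e. the
    # nearest point of V_k(R^n) to alpha^T X in Frobenius norm.
    U, _, Vt = np.linalg.svd(alpha.T @ X, full_matrices=False)
    return U @ Vt
```

Equivariance can be checked directly: for a random Q ∈ O(k), `project(alpha, X @ Q)` agrees with `project(alpha, X) @ Q` up to floating-point error.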
