Manifold Learning: The Price of Normalization

06/16/2008
by Y. Goldberg et al.

We analyze the performance of a class of manifold-learning algorithms that find their output by minimizing a quadratic form under normalization constraints. This class consists of Locally Linear Embedding (LLE), Laplacian Eigenmap, Local Tangent Space Alignment (LTSA), Hessian Eigenmaps (HLLE), and Diffusion Maps. We present and prove conditions on the manifold that are necessary for the success of these algorithms. Both the finite-sample case and the limit case are analyzed. We show that there are simple manifolds on which the necessary conditions are violated, and hence the algorithms cannot recover the underlying manifolds. Finally, we present numerical results that demonstrate our claims.
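
For concreteness, the following is a minimal sketch (in Python with NumPy/SciPy; not code from the paper) of the pattern the abstract describes, using Laplacian Eigenmap as the representative algorithm: the embedding Y minimizes the quadratic form tr(Y^T L Y) subject to the normalization constraint Y^T D Y = I, solved as the generalized eigenproblem L y = λ D y. The function name, neighborhood size, and unit edge weights below are illustrative assumptions, not choices made in the paper.

    import numpy as np
    from scipy.spatial.distance import cdist
    from scipy.linalg import eigh

    def laplacian_eigenmap(X, n_neighbors=10, n_components=2):
        """Embed rows of X by minimizing y^T L y under the constraint y^T D y = 1."""
        n = X.shape[0]
        dists = cdist(X, X)
        # Symmetric k-nearest-neighbor adjacency with unit weights (an assumption;
        # heat-kernel weights are another common choice).
        W = np.zeros((n, n))
        for i in range(n):
            neighbors = np.argsort(dists[i])[1:n_neighbors + 1]  # skip the point itself
            W[i, neighbors] = 1.0
        W = np.maximum(W, W.T)
        D = np.diag(W.sum(axis=1))
        L = D - W  # unnormalized graph Laplacian
        # Generalized eigenproblem L y = lambda D y; the D on the right-hand side
        # enforces the normalization constraint whose consequences the paper analyzes.
        eigvals, eigvecs = eigh(L, D)
        # Drop the trivial constant eigenvector (eigenvalue ~ 0).
        return eigvecs[:, 1:n_components + 1]

The other algorithms in the class (LLE, LTSA, HLLE, Diffusion Maps) follow the same template with different quadratic forms and normalization constraints; this shared structure is what the paper's analysis targets.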


Related research

Isometric Multi-Manifolds Learning (12/03/2009)
Isometric feature mapping (Isomap) is a promising manifold learning meth...

Spectral Convergence of the connection Laplacian from random samples (06/07/2013)
Spectral methods that are based on eigenvectors and eigenvalues of discr...

Hessian Based Smoothing Splines for Manifold Learning (02/10/2023)
We propose a multidimensional smoothing spline algorithm in the context ...

Total variation regularization for manifold-valued data (12/30/2013)
We consider total variation minimization for manifold valued data. We pr...

Manifold Diffusion Fields (05/24/2023)
We present Manifold Diffusion Fields (MDF), an approach to learn generat...

A new locally linear embedding scheme in light of Hessian eigenmap (12/16/2021)
We provide a new interpretation of Hessian locally linear embedding (HLL...

The Geometry of Nodal Sets and Outlier Detection (06/05/2017)
Let (M,g) be a compact manifold and let -Δϕ_k = λ_k ϕ_k be the sequence ...
