Class Mean Vector Component and Discriminant Analysis for Kernel Subspace Learning

12/14/2018
by Alexandros Iosifidis, et al.

The kernel matrix used in kernel methods encodes all the information required for solving complex nonlinear problems defined on data representations in the input space using simple, but implicitly defined, solutions. Spectral analysis of the kernel matrix defines an explicit nonlinear mapping of the input data representations to a subspace of the kernel space, in which linear methods can be applied directly. However, the selection of the kernel subspace is crucial for the performance of the subsequent processing steps. In this paper, we propose a component analysis method for kernel-based dimensionality reduction that optimally preserves the pairwise distances of the class means in the feature space. We provide an extensive analysis of the connection between the proposed criterion and those used in kernel principal component analysis and kernel discriminant analysis, which leads to a discriminant analysis version of the proposed method. We illustrate the properties of the proposed approach on real-world data.
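To make the setting concrete, here is a minimal Python sketch of the generic pipeline the abstract describes: spectral analysis of a centered kernel matrix gives an explicit embedding of the training data into a kernel subspace, in which the pairwise distances between class means, the quantity the proposed criterion preserves, can be computed directly. This is standard kernel PCA machinery, not the paper's proposed component analysis; the RBF kernel choice, the function names (`rbf_kernel`, `kernel_subspace_embed`, `class_mean_distances`), and all parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def kernel_subspace_embed(X, n_components, gamma=1.0):
    """KPCA-style explicit embedding of X into a kernel subspace.

    Spectral analysis of the centered kernel matrix K_c = J K J yields
    the embedding Phi = V * sqrt(Lambda) of the training points.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    Kc = J @ K @ J                        # center in the kernel space
    vals, vecs = np.linalg.eigh(Kc)       # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = np.maximum(vals[idx], 0.0), vecs[:, idx]
    return vecs * np.sqrt(vals)

def class_mean_distances(Phi, y):
    """Pairwise Euclidean distances between class means in the subspace."""
    means = np.stack([Phi[y == c].mean(axis=0) for c in np.unique(y)])
    diff = means[:, None, :] - means[None, :, :]
    return np.linalg.norm(diff, axis=-1)

# Toy usage: two Gaussian blobs, embed, inspect class-mean distances.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(3, 1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
Phi = kernel_subspace_embed(X, n_components=2, gamma=0.1)
print(class_mean_distances(Phi, y))
```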
