Local Component Analysis for Nonparametric Bayes Classifier

10/24/2010
by Mahmoud Khademi, et al.

The decision boundaries of the Bayes classifier are optimal in the sense that they yield the maximum probability of correct decision. This means that if we knew the prior probabilities and the class-conditional densities, we could design a classifier with the lowest possible probability of error. However, in classification based on nonparametric density estimation methods such as Parzen windows, the decision regions depend on the choice of parameters such as the window width. Moreover, these methods suffer from the curse of dimensionality of the feature space and the small sample size problem, which severely restrict their practical application. In this paper, we address these problems by introducing a novel dimension reduction and classification method based on local component analysis. In this method, by adopting an iterative cross-validation algorithm, we simultaneously estimate the optimal transformation matrices (for dimension reduction) and the classifier parameters based on local information. The proposed method can classify data with complicated boundaries and also alleviates the curse of dimensionality dilemma. Experiments on real data show the superiority of the proposed algorithm in terms of classification accuracy for pattern classification applications such as age, facial expression, and character recognition.

Keywords: Bayes classifier, curse of dimensionality dilemma, Parzen window, pattern classification, subspace learning.
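The classifier the abstract starts from is the standard Parzen-window Bayes rule: estimate each class-conditional density by a kernel average over that class's training samples, multiply by the empirical class prior, and assign the class with the largest product. The sketch below shows only that baseline, not the proposed local-component-analysis method (whose transformation matrices and cross-validation loop are described in the full paper); the Gaussian kernel choice, the window width `h`, and the function name `parzen_bayes_classify` are illustrative assumptions.

```python
import numpy as np

def parzen_bayes_classify(X_train, y_train, X_test, h=1.0):
    """Bayes-rule classification with Parzen-window (Gaussian kernel)
    estimates of the class-conditional densities; h is the window width."""
    classes = np.unique(y_train)
    n, d = X_train.shape
    norm = (2.0 * np.pi * h ** 2) ** (d / 2.0)  # Gaussian kernel normaliser

    scores = np.zeros((X_test.shape[0], classes.size))
    for j, c in enumerate(classes):
        Xc = X_train[y_train == c]
        prior = Xc.shape[0] / n  # empirical prior P(c)
        # Squared distances between every test point and every class-c sample.
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=2)
        # Parzen estimate of p(x | c): average of Gaussian kernels of width h.
        p_x_given_c = np.exp(-d2 / (2.0 * h ** 2)).sum(axis=1) / (Xc.shape[0] * norm)
        scores[:, j] = prior * p_x_given_c  # proportional to the posterior P(c | x)

    return classes[np.argmax(scores, axis=1)]
```

The window width `h` is precisely the parameter the abstract points to: its choice determines the decision regions, and in high-dimensional feature spaces with few samples the kernel averages become unreliable, which is what motivates combining the classifier with dimension reduction.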
