Local Component Analysis for Nonparametric Bayes Classifier

by Mahmoud Khademi, et al.

The decision boundaries of the Bayes classifier are optimal in the sense that they yield the maximum probability of a correct decision: if we knew the prior probabilities and the class-conditional densities, we could design a classifier with the lowest possible probability of error. In classification based on nonparametric density estimation methods such as Parzen windows, however, the decision regions depend on the choice of parameters such as the window width. Moreover, these methods suffer from the curse of dimensionality of the feature space and the small-sample-size problem, which severely restricts their practical application. In this paper, we address these problems by introducing a novel dimension reduction and classification method based on local component analysis. In this method, by adopting an iterative cross-validation algorithm, we simultaneously estimate the optimal transformation matrices (for dimension reduction) and the classifier parameters from local information. The proposed method can classify data with complicated boundaries and also alleviates the curse of dimensionality. Experiments on real data show the superiority of the proposed algorithm in terms of classification accuracy in pattern classification applications such as age, facial expression, and character recognition.

Keywords: Bayes classifier, curse of dimensionality, Parzen window, pattern classification, subspace learning.
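To make the setting concrete, the following is a minimal sketch (not the authors' algorithm) of a nonparametric Bayes classifier built on Parzen-window density estimates, together with a leave-one-out search over the window width — the parameter the abstract identifies as shaping the decision regions. All function names and the toy data are illustrative assumptions.

```python
import numpy as np

def parzen_density(x, samples, h):
    """Parzen-window estimate of p(x) using an isotropic Gaussian kernel of width h."""
    d = samples.shape[1]
    diffs = (samples - x) / h
    kernels = np.exp(-0.5 * np.sum(diffs**2, axis=1)) / ((2 * np.pi) ** (d / 2) * h**d)
    return kernels.mean()

def parzen_bayes_classify(x, class_samples, priors, h):
    """Bayes rule with estimated densities: pick the class maximizing prior * p(x|class)."""
    scores = [p * parzen_density(x, s, h) for s, p in zip(class_samples, priors)]
    return int(np.argmax(scores))

def loo_accuracy(class_samples, priors, h):
    """Leave-one-out accuracy of the Parzen-Bayes rule for a given window width h."""
    correct = total = 0
    for y, samples in enumerate(class_samples):
        for i in range(len(samples)):
            # hold out sample i of class y, classify it with the rest
            rest = [np.delete(s, i, axis=0) if k == y else s
                    for k, s in enumerate(class_samples)]
            correct += parzen_bayes_classify(samples[i], rest, priors, h) == y
            total += 1
    return correct / total

# toy example: two well-separated 2-D Gaussian classes
rng = np.random.default_rng(0)
c0 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
c1 = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(50, 2))

label = parzen_bayes_classify(np.array([2.8, 3.1]), [c0, c1], [0.5, 0.5], h=0.5)
# cross-validated choice of window width from a small candidate grid
best_h = max([0.1, 0.5, 1.0], key=lambda h: loo_accuracy([c0, c1], [0.5, 0.5], h))
```

This illustrates only the baseline the paper starts from; the proposed method additionally learns transformation matrices for dimension reduction jointly with these classifier parameters, which the sketch does not attempt.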


