Title :
Kernel Fisher Discriminants and Kernel Nearest Neighbor Classifiers: A Comparative Study for Large-Scale Learning Problems
Author :
Daqi, Gao; Jie, Li
Author_Institution :
East China Univ. of Sci. & Technol., Shanghai
Abstract :
One solution that lets kernel Fisher discriminants (KFDs) handle large-scale learning problems is to cover all the training samples of a class with multiple hyperspheres, one by one, where each sphere should include as many samples from the class as possible. In this way, the KFDs can be carried out in a relatively low-dimensional kernel space. This paper clarifies the fact that a dataset that is nonlinearly separable in the input space does not necessarily become linearly separable in the kernel space. We therefore propose a kernel nearest neighbor classification method, in which a sample is labeled according to the minimum distance between it and the surfaces of the existing kernel hyperspheres. Experimental results on letter and handwritten digit recognition show that the proposed method is quite effective for solving large-scale learning problems.
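The surface-distance rule in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the class name `KernelSphereNN`, the use of a single RBF-kernel hypersphere per class (centered at the kernel-space class mean, with radius set by the farthest member), and all parameter values are assumptions made for the example.

```python
import numpy as np

def rbf(a, b, gamma=0.5):
    # Gaussian (RBF) kernel matrix between two sample matrices (n,d) x (m,d) -> (n,m)
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KernelSphereNN:
    """Illustrative sketch: each class is covered by ONE hypersphere in kernel
    feature space; a query is labeled by the class whose sphere SURFACE is
    nearest, as in the rule described in the abstract."""

    def fit(self, X, y, gamma=0.5):
        self.gamma = gamma
        self.classes_, self.spheres_ = [], []
        for c in np.unique(y):
            Xc = X[y == c]
            K = rbf(Xc, Xc, gamma)
            # squared kernel-space distance of each member to the class mean:
            # ||phi(x) - m||^2 = k(x,x) - 2 * mean_j k(x, x_j) + mean_ij k(x_i, x_j)
            d2 = np.diag(K) - 2 * K.mean(1) + K.mean()
            # radius = farthest member, so the sphere covers the whole class
            self.classes_.append(c)
            self.spheres_.append((Xc, K.mean(), np.sqrt(d2.max())))
        return self

    def predict(self, X):
        surface_dists = []
        for Xc, kmean, r in self.spheres_:
            kxx = np.ones(len(X))            # k(x, x) = 1 for the RBF kernel
            kxc = rbf(X, Xc, self.gamma).mean(1)
            d = np.sqrt(np.maximum(kxx - 2 * kxc + kmean, 0.0))
            surface_dists.append(np.abs(d - r))   # distance to the sphere surface
        return np.array(self.classes_)[np.argmin(np.array(surface_dists), 0)]
```

Using one sphere per class keeps the example short; the paper's scheme covers each class with multiple hyperspheres, which the same surface-distance rule extends to by taking the minimum over all spheres of all classes.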
Keywords :
learning (artificial intelligence); pattern classification; handwritten digit recognition; kernel Fisher discriminants; kernel nearest neighbor classifiers; large-scale learning problems; multiple hyperspheres; nonlinearly separable dataset; Euclidean distance; Handwriting recognition; Kernel; Large-scale systems; Multilayer perceptrons; Nearest neighbor searches; Prototypes; Radial basis function networks; Support vector machine classification; Support vector machines
Conference_Titel :
2006 IEEE International Joint Conference on Neural Networks (IJCNN '06)
Conference_Location :
Vancouver, BC
Print_ISBN :
0-7803-9490-9
DOI :
10.1109/IJCNN.2006.246847