Title :
Data classification with a relaxed model of variable kernel density estimation
Author :
Oyang, Yen-Jen ; Ou, Yu-Yen ; Hwang, Shien-Ching ; Chen, Chien-Yu ; Chang, Darby Tien-Hau
Author_Institution :
Dept. of Comput. Sci. & Inf. Eng., Nat. Taiwan Univ., Taipei, Taiwan
Date :
31 July-4 Aug. 2005
Abstract :
In recent years, kernel density estimation has been exploited by computer scientists to model several important problems in machine learning, bioinformatics, and computer vision. However, when the dimension of the data set is high, conventional kernel density estimators suffer poor convergence rates of the pointwise mean square error (MSE) and the integrated mean square error (IMSE). The design of a kernel density estimator that overcomes this problem has therefore been a long-standing challenge. This paper proposes a relaxed model of variable kernel density estimation and analyzes its performance in data classification applications. It is proved that, in terms of pointwise MSE, the convergence rate of the relaxed variable kernel density estimator can approach O(n⁻¹) regardless of the dimension of the data set, where n is the number of sampling instances. Experiments with data classification applications show that the improved convergence rate of the pointwise MSE leads to higher prediction accuracy. In fact, the experimental results also show that a data classifier built on the relaxed variable kernel density estimator delivers the same level of prediction accuracy as an SVM with the Gaussian kernel.
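To make the idea of variable kernel density estimation concrete, the sketch below implements a generic sample-point variable-bandwidth Gaussian KDE in NumPy. It is not the authors' relaxed model: the per-sample bandwidths here are set by a common k-nearest-neighbour heuristic (the function name `variable_kde` and the parameter `k` are illustrative choices, not from the paper), whereas the paper's contribution lies in how the bandwidth constraints are relaxed to improve the pointwise-MSE convergence rate.

```python
import numpy as np

def variable_kde(x, samples, k=5):
    """Estimate the density at point x from the given samples using a
    sample-point variable-bandwidth Gaussian kernel.

    Each sample i gets its own bandwidth h_i, taken as the distance to
    its k-th nearest neighbour (a standard heuristic; the paper's
    relaxed model selects bandwidths differently).
    """
    samples = np.asarray(samples, dtype=float)
    n, d = samples.shape
    # Pairwise distances between samples, used to pick per-sample bandwidths.
    diffs = samples[:, None, :] - samples[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    # k-th nearest neighbour distance (index 0 along the sorted axis is
    # the sample itself, at distance 0), guarded against zero bandwidth.
    h = np.maximum(np.sort(dists, axis=1)[:, k], 1e-12)
    # Gaussian kernel centred at each sample with its own bandwidth,
    # averaged over all samples.
    u = np.linalg.norm(np.asarray(x, dtype=float) - samples, axis=1) / h
    norm = (2.0 * np.pi) ** (d / 2.0) * h ** d
    return float(np.mean(np.exp(-0.5 * u ** 2) / norm))
```

For classification, one such density estimate per class combined with class priors yields a plug-in Bayes classifier, which is the general setting in which the paper compares its estimator against an SVM with the Gaussian kernel.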
Keywords :
convergence of numerical methods; data analysis; estimation theory; mean square error methods; relaxation theory; Gaussian kernel; RBF network; bioinformatics; computer vision; data classification; integrated mean square error; kernel density estimation; machine learning; pointwise mean square error; radial basis function; Accuracy; Bioinformatics; Computer vision; Convergence; Kernel; Machine learning; Mean square error methods; Performance analysis; Sampling methods; Support vector machines;
Conference_Title :
Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05)
Print_ISBN :
0-7803-9048-2
DOI :
10.1109/IJCNN.2005.1556374