Title of article :
Approximations of the standard principal components analysis and kernel PCA
Author/Authors :
Zhang, Rui and Wang, Wenjian and Ma, Yichen
Issue Information :
Journal issue, serial year 2010
Pages :
7
From page :
6531
To page :
6537
Abstract :
Principal component analysis (PCA) is a powerful technique for extracting structure from possibly high-dimensional data sets, while kernel PCA (KPCA) is the application of PCA in a kernel-defined feature space. For standard PCA and KPCA, if the dataset is large, storing the kernel matrix requires a great deal of memory, and computing its eigenvalues and corresponding eigenvectors is time-consuming. The aim of this paper is to learn linear and nonlinear principal components using only a subset of the data points, and to determine which data points should be used. To verify the performance of the proposed approaches, a series of experiments on artificial datasets and UCI benchmark datasets was carried out. Simulation results demonstrate that the proposed approaches can match or outperform standard PCA and KPCA in generalization ability while consuming substantially less memory and time.
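To make the memory bottleneck concrete, the following is a minimal sketch of standard KPCA with an RBF kernel (not the paper's approximation method): the full n-by-n kernel matrix must be built and eigendecomposed, which is exactly the O(n^2) storage and O(n^3) computation cost that motivates approximating with a few partial data points. The function name and parameters are illustrative, not from the paper.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Standard KPCA sketch: full kernel matrix, full eigendecomposition."""
    # RBF kernel matrix: O(n^2) memory -- the bottleneck for large datasets
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    n = K.shape[0]
    # Center the kernel matrix in feature space
    one_n = np.ones((n, n)) / n
    K_c = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecomposition: O(n^3) time for a symmetric matrix
    eigvals, eigvecs = np.linalg.eigh(K_c)          # ascending order
    idx = np.argsort(eigvals)[::-1][:n_components]  # top components
    # Normalize eigenvectors so projected features have unit variance scale
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    return K_c @ alphas  # nonlinear principal component scores of X
```

Approximation schemes such as the one proposed here avoid forming the full K by working with a small, well-chosen subset of rows/columns.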
Keywords :
PCA , KPCA , Kernel space , Orthogonal projection
Journal title :
Expert Systems with Applications
Serial Year :
2010
Record number :
2348347