DocumentCode :
812124
Title :
Kernel Component Analysis Using an Epsilon-Insensitive Robust Loss Function
Author :
Alzate, Carlos ; Suykens, Johan A K
Author_Institution :
Dept. of Electr. Eng., Katholieke Univ. Leuven, Leuven
Volume :
19
Issue :
9
fYear :
2008
Firstpage :
1583
Lastpage :
1598
Abstract :
Kernel principal component analysis (PCA) is a technique to perform feature extraction in a high-dimensional feature space that is nonlinearly related to the original input space. The kernel PCA formulation corresponds to an eigendecomposition of the kernel matrix: eigenvectors with large eigenvalues correspond to the principal components in the feature space. Starting from the least squares support vector machine (LS-SVM) formulation of kernel PCA, we extend it to a generalized form of kernel component analysis (KCA) in which the underlying loss function is made explicit. For classical kernel PCA, the underlying loss function is L2. In this generalized form, other loss functions can also be plugged in. In the context of robust statistics, it is known that the L2 loss function is not robust because its influence function is not bounded. Therefore, outliers can skew the solution away from the desired one. Another issue with kernel PCA is the lack of sparseness: the principal components are dense expansions in terms of kernel functions. In this paper, we introduce robustness and sparseness into kernel component analysis by using an epsilon-insensitive robust loss function. We propose two different algorithms. The first method solves a set of nonlinear equations using the kernel PCA solution as a starting point. The second method uses a simplified iterative weighting procedure that leads to solving a sequence of generalized eigenvalue problems. Simulations with toy and real-life data show improvements in robustness together with a sparse representation.
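The classical (L2-loss) kernel PCA baseline that the abstract starts from can be illustrated with a minimal sketch: build a kernel matrix, center it in feature space, and take the top eigenvectors as component expansions. This is only the standard textbook procedure, not the authors' robust KCA algorithms; the RBF kernel and the `sigma` bandwidth are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian (RBF) kernel matrix; sigma is an assumed bandwidth."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def kernel_pca(X, n_components=2, sigma=1.0):
    """Classical kernel PCA: eigendecomposition of the centered kernel matrix.

    Returns the projections of the training points onto the top
    principal components in feature space.
    """
    N = X.shape[0]
    K = rbf_kernel(X, sigma)
    # Center the kernel matrix in feature space:
    # Kc = (I - 1/N) K (I - 1/N), with 1/N the constant averaging matrix.
    one = np.ones((N, N)) / N
    Kc = K - one @ K - K @ one + one @ K @ one
    # Symmetric eigendecomposition; eigenvalues come back in ascending order.
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    alphas = vecs[:, idx]        # dense expansions in terms of kernel functions
    lambdas = np.maximum(vals[idx], 1e-12)
    # Project training data onto the (unit-norm) feature-space components.
    return Kc @ (alphas / np.sqrt(lambdas))
```

Note the comment on `alphas`: every training point receives a nonzero coefficient, which is exactly the lack of sparseness the paper addresses; the epsilon-insensitive loss instead zeroes out coefficients whose residuals fall inside the insensitivity tube.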
Keywords :
eigenvalues and eigenfunctions; feature extraction; iterative methods; least squares approximations; matrix decomposition; nonlinear equations; principal component analysis; support vector machines; unsupervised learning; LS-SVM; PCA; eigendecomposition; eigenvalue; eigenvector; epsilon-insensitive robust loss function; feature extraction; high-dimensional feature space; iterative weighting procedure; kernel matrix; kernel principal component analysis; least squares support vector machine; nonlinear equation; unsupervised learning; Epsilon-insensitive loss function; kernel principal component analysis (PCA); least squares support vector machines (LS-SVM); loss function; robustness; sparseness; Algorithms; Artificial Intelligence; Computer Simulation; Models, Statistical; Pattern Recognition, Automated; Principal Component Analysis;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2008.2000443
Filename :
4570256