DocumentCode :
1498651
Title :
A Least-Squares Framework for Component Analysis
Author :
De La Torre, Fernando
Author_Institution :
Robot. Inst., Carnegie Mellon Univ., Pittsburgh, PA, USA
Volume :
34
Issue :
6
fYear :
2012
fDate :
June 1, 2012
Firstpage :
1041
Lastpage :
1055
Abstract :
Over the last century, Component Analysis (CA) methods such as Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Canonical Correlation Analysis (CCA), Locality Preserving Projections (LPP), and Spectral Clustering (SC) have been extensively used as a feature extraction step for modeling, classification, visualization, and clustering. CA techniques are appealing because many can be formulated as eigen-problems, offering great potential for learning linear and nonlinear representations of data in closed form. However, the eigen-formulation often conceals important analytic and computational drawbacks of CA techniques, such as solving generalized eigen-problems with rank-deficient matrices (e.g., the small sample size problem), the lack of an intuitive interpretation of normalization factors, and difficulty in understanding the commonalities and differences between CA methods. This paper proposes a unified least-squares framework to formulate many CA methods. We show how PCA, LDA, CCA, LPP, SC, and their kernel and regularized extensions correspond to particular instances of least-squares weighted kernel reduced rank regression (LS-WKRRR). The LS-WKRRR formulation of CA methods has several benefits: 1) it provides a clean connection between many CA techniques and an intuitive framework for understanding normalization factors; 2) it yields efficient numerical schemes for solving CA techniques; 3) it overcomes the small sample size problem; 4) it provides a framework for easily extending CA methods. We derive weighted generalizations of PCA, LDA, SC, and CCA, as well as several new CA techniques.
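As a minimal illustration of the least-squares view described in the abstract (not the paper's full LS-WKRRR formulation, which also handles weights and kernels), PCA itself can be recovered as a rank-k least-squares regression: minimizing ||X - BA||_F over factors B and A by alternating least squares yields a column space of B that spans the top-k principal subspace of the centered data. The variable names and synthetic data below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic centered data: 200 samples in R^5, with most variance
# concentrated along the first two coordinate directions.
n, d, k = 200, 5, 2
X = rng.normal(size=(d, n)) * np.array([5.0, 3.0, 0.5, 0.4, 0.3])[:, None]
X -= X.mean(axis=1, keepdims=True)

# Alternating least squares for min_{B,A} ||X - B A||_F^2,
# with B (d x k) and A (k x n).
B = rng.normal(size=(d, k))
for _ in range(100):
    A = np.linalg.lstsq(B, X, rcond=None)[0]        # fix B, solve for A
    B = np.linalg.lstsq(A.T, X.T, rcond=None)[0].T  # fix A, solve for B

# The column space of B should match the top-k principal subspace,
# i.e., the leading eigenvectors of the (scaled) covariance X X^T.
Q, _ = np.linalg.qr(B)            # orthonormal basis for span(B)
evals, evecs = np.linalg.eigh(X @ X.T)
U = evecs[:, -k:]                 # top-k principal directions
gap = np.linalg.norm(Q @ Q.T - U @ U.T)  # distance between projections
print(gap)
```

The printed subspace gap is numerically negligible, consistent with the abstract's claim that eigen-formulated CA methods admit equivalent least-squares formulations; the weighted and kernelized variants in the paper generalize this basic construction.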
Keywords :
correlation methods; data visualisation; feature extraction; learning (artificial intelligence); least squares approximations; matrix algebra; pattern classification; pattern clustering; principal component analysis; regression analysis; CA techniques; canonical correlation analysis; classification; data nonlinear representation learning; eigen-formulation; eigen-problems; feature extraction step; least-squares framework; least-squares weighted kernel reduced rank regression; linear discriminant analysis; locality preserving projections; modeling; lack of intuitive interpretation of normalization factors; numerical schemes; rank deficient matrices; small sample size problem; spectral clustering; visualization; Algorithm design and analysis; Analytical models; Covariance matrix; Equations; Kernel; Mathematical model; Principal component analysis; canonical correlation analysis; dimensionality reduction; k-means; kernel methods; linear discriminant analysis; reduced rank regression; spectral clustering; Algorithms; Humans; Least-Squares Analysis; Pattern Recognition, Automated; Principal Component Analysis; Sample Size;
fLanguage :
English
Journal_Title :
Pattern Analysis and Machine Intelligence, IEEE Transactions on
Publisher :
IEEE
ISSN :
0162-8828
Type :
jour
DOI :
10.1109/TPAMI.2011.184
Filename :
6186732