DocumentCode :
1117768
Title :
An Optimal Set of Discriminant Vectors
Author :
Foley, Donald H. ; Sammon, John W., Jr.
Author_Institution :
Pattern Analysis and Recognition Corporation
Issue :
3
fYear :
1975
fDate :
3/1/1975
Firstpage :
281
Lastpage :
289
Abstract :
A new method for the extraction of features in a two-class pattern recognition problem is derived. The main advantage is that the method for selecting features is based entirely upon discrimination or separability as opposed to the more common approach of fitting. The classical example of fitting is the use of the eigenvectors of the lumped covariance matrix corresponding to the largest eigenvalues. In an analogous manner, the new technique selects discriminant vectors (or features) corresponding to the largest "discrim-values." The new method is compared to some of the more popular alternative techniques via both data-dependent and mathematical examples. In addition, a recursive method for obtaining the discriminant vectors is given.
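The abstract describes selecting discriminant vectors that maximize a class-separability criterion rather than a fitting criterion. As a minimal illustration, the sketch below computes the first such vector as the direction maximizing the classical Fisher ratio (squared projected mean difference over pooled within-class scatter), which is the standard starting point for this family of methods. The function name and the use of NumPy are assumptions for illustration; the paper's recursion for obtaining further "discrim-value"-ranked vectors is not reproduced here.

```python
import numpy as np

def fisher_direction(X1, X2):
    """Illustrative sketch: first discriminant vector for two classes.

    Maximizes the Fisher ratio (w . (m1 - m2))^2 / (w^T Sw w),
    whose maximizer is proportional to Sw^{-1} (m1 - m2).
    X1, X2: (n_i, d) arrays of samples from each class.
    """
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Pooled within-class scatter matrix (sum of per-class scatter).
    Sw = (np.cov(X1, rowvar=False) * (len(X1) - 1)
          + np.cov(X2, rowvar=False) * (len(X2) - 1))
    w = np.linalg.solve(Sw, m1 - m2)
    return w / np.linalg.norm(w)  # unit-length discriminant vector
```

For two well-separated clusters the returned unit vector aligns with the axis along which the class means differ; projecting the data onto it reduces the two-class problem to one dimension while preserving separability.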
Keywords :
Dimensionality reduction; discriminants; eigenvectors; feature extraction; feature ranking; feature selection; Karhunen-Loeve expansions; multivariate data analysis; pattern classification; pattern recognition; Covariance matrix; Eigenvalues and eigenfunctions; Karhunen-Loeve transforms; Logic design; Pattern analysis; Piecewise linear techniques; Vectors
fLanguage :
English
Journal_Title :
IEEE Transactions on Computers
Publisher :
IEEE
ISSN :
0018-9340
Type :
jour
DOI :
10.1109/T-C.1975.224208
Filename :
1672801