DocumentCode :
1243357
Title :
Divergence based feature selection for multimodal class densities
Author :
Novovicová, Jana ; Pudil, Pavel ; Kittler, Josef
Author_Institution :
Inst. of Inf. Theory & Autom., Czechoslovak Acad. of Sci., Prague, Czech Republic
Volume :
18
Issue :
2
fYear :
1996
fDate :
2/1/1996 12:00:00 AM
Firstpage :
218
Lastpage :
223
Abstract :
A new feature selection procedure based on the Kullback J-divergence between two class conditional density functions, each approximated by a finite mixture of parameterized densities of a special type, is presented. This procedure is especially suitable for multimodal data. Apart from finding a feature subset of any cardinality without involving any search procedure, it also simultaneously yields a pseudo-Bayes decision rule. Its performance is tested on real data.
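As an illustration of the general idea, the sketch below scores features by the symmetric Kullback J-divergence between per-class density estimates. It is a deliberate simplification: it fits a single univariate Gaussian per class and feature rather than the finite mixtures of the paper (so it would not capture multimodality), and the function names are hypothetical.

```python
import numpy as np

def gaussian_j_divergence(mu1, var1, mu2, var2):
    # Symmetric Kullback J-divergence KL(p||q) + KL(q||p)
    # between two univariate Gaussians (closed form).
    d2 = (mu1 - mu2) ** 2
    return 0.5 * ((var1 + d2) / var2 + (var2 + d2) / var1 - 2.0)

def rank_features(X1, X2):
    # Score each feature by the J-divergence of its per-class
    # Gaussian fits; return feature indices, most divergent first.
    scores = [
        gaussian_j_divergence(
            X1[:, j].mean(), X1[:, j].var() + 1e-12,
            X2[:, j].mean(), X2[:, j].var() + 1e-12,
        )
        for j in range(X1.shape[1])
    ]
    return np.argsort(scores)[::-1]
```

Because the divergence decomposes per feature under this assumption, a subset of any cardinality k is obtained by taking the top-k indices directly, with no combinatorial search, which mirrors the search-free property claimed in the abstract.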
Keywords :
Bayes methods; decision theory; feature extraction; Kullback J-divergence; class conditional density functions; divergence-based feature selection; multimodal class densities; pseudo-Bayes decision rule; search procedure; Algorithm design and analysis; Approximation error; Automation; Density functional theory; Machine intelligence; Pattern recognition; Probability density function; Probability distribution; Testing; Usability;
fLanguage :
English
Journal_Title :
IEEE Transactions on Pattern Analysis and Machine Intelligence
Publisher :
IEEE
ISSN :
0162-8828
Type :
jour
DOI :
10.1109/34.481557
Filename :
481557
Link To Document :