Title :
Divergence based feature selection for multimodal class densities
Author :
Novovicová, Jana ; Pudil, Pavel ; Kittler, Josef
Author_Institution :
Inst. of Inf. Theory & Autom., Czechoslovak Acad. of Sci., Prague, Czech Republic
fDate :
2/1/1996 12:00:00 AM
Abstract :
A new feature selection procedure is presented, based on the Kullback J-divergence between two class-conditional density functions approximated by finite mixtures of parameterized densities of a special type. The procedure is especially suitable for multimodal data. Besides finding a feature subset of any cardinality without involving a search procedure, it simultaneously yields a pseudo-Bayes decision rule. Its performance is tested on real data.
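The paper's procedure approximates the class-conditional densities by finite mixtures of a special parameterized type. As a minimal illustration of the underlying idea (ranking features by the symmetric Kullback J-divergence between the two class-conditional densities), here is a sketch under a simplifying assumption not made in the paper: each feature is modeled per class by a single univariate Gaussian, for which J has a closed form. All function names are hypothetical.

```python
import numpy as np

def j_divergence_gauss(m1, s1, m2, s2):
    """Symmetric Kullback J-divergence between two univariate Gaussians.

    J(p1, p2) = KL(p1 || p2) + KL(p2 || p1); closed form for Gaussians
    N(m1, s1^2) and N(m2, s2^2).
    """
    d2 = (m1 - m2) ** 2
    return (s1**2 + d2) / (2 * s2**2) + (s2**2 + d2) / (2 * s1**2) - 1.0

def rank_features_by_j(X1, X2):
    """Score each feature by the J-divergence between the two class samples.

    X1, X2: (n_samples, n_features) arrays for class 1 and class 2.
    Returns feature indices sorted by decreasing divergence, and the scores.
    """
    scores = []
    for j in range(X1.shape[1]):
        m1, s1 = X1[:, j].mean(), X1[:, j].std(ddof=1)
        m2, s2 = X2[:, j].mean(), X2[:, j].std(ddof=1)
        scores.append(j_divergence_gauss(m1, s1, m2, s2))
    scores = np.array(scores)
    return np.argsort(scores)[::-1], scores

# Synthetic two-class data: feature 0 separates the classes, feature 1 is noise.
rng = np.random.default_rng(0)
X1 = np.column_stack([rng.normal(0.0, 1.0, 500), rng.normal(0.0, 1.0, 500)])
X2 = np.column_stack([rng.normal(3.0, 1.0, 500), rng.normal(0.0, 1.0, 500)])
order, scores = rank_features_by_j(X1, X2)
print(order, scores)
```

Note that unlike the paper's mixture-based criterion, this unimodal sketch cannot capture multimodal class densities; it only shows how a divergence score ranks features without any combinatorial search.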
Keywords :
Bayes methods; decision theory; feature extraction; Kullback J-divergence; class conditional density functions; divergence-based feature selection; multimodal class densities; pseudo-Bayes decision rule; search procedure; Algorithm design and analysis; Approximation error; Automation; Density functional theory; Machine intelligence; Pattern recognition; Probability density function; Probability distribution; Testing; Usability;
Journal_Title :
IEEE Transactions on Pattern Analysis and Machine Intelligence