Title :
On feature extraction by mutual information maximization
Author_Institution :
Motorola Labs, 7700 South River Parkway, MD ML28, Tempe AZ 85284, USA
Abstract :
In order to learn discriminative feature transforms, we discuss mutual information between class labels and transformed features as a criterion. Instead of Shannon's definition we use measures based on Renyi entropy, which lends itself to an efficient implementation and to an interpretation in terms of "information potentials" and "information forces" induced by samples of data. This paper presents two routes towards practical usability of the method, aimed especially at large databases: the first is an on-line stochastic gradient algorithm, and the second is based on approximating class densities in the output space by Gaussian mixture models.
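The "information potential" interpretation mentioned in the abstract comes from the Renyi quadratic entropy: under a Parzen density estimate with Gaussian kernels, the integral of the squared density collapses into a double sum of pairwise kernel evaluations, so no numerical integration is needed. A short sketch of that identity, and of a quadratic mutual-information measure built from such potentials, follows; the symbols V_IN, V_ALL, V_BTW and the kernel width sigma are our notation for the sketch, not taken from the abstract.

```latex
% Renyi quadratic entropy:  H_{R2}(Y) = -\log \int p(y)^2 \, dy.
% With a Parzen estimate \hat p(y) = \frac{1}{N}\sum_i G(y - y_i, \sigma^2 I),
% the Gaussian convolution identity turns the integral into a double sum over
% sample pairs, the "information potential" V:
\[
\int \hat p(y)^2 \, dy
  = \frac{1}{N^2} \sum_{i=1}^{N} \sum_{j=1}^{N}
    G\!\left(y_i - y_j,\; 2\sigma^2 I\right)
  \;\equiv\; V(\{y_i\}).
\]
% A quadratic mutual information between class labels C and transformed
% features Y can then be assembled entirely from such potentials:
\[
I_T(C, Y) \;=\; \sum_{c} \int \big( p(c, y) - P(c)\, p(y) \big)^2 \, dy
          \;=\; V_{IN} + V_{ALL} - 2\, V_{BTW}.
\]
% The derivative of each pairwise kernel term with respect to a sample y_i
% acts as an attractive or repulsive "information force" between sample pairs.
```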
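For the first route, the on-line stochastic gradient, a minimal sketch can be given for a linear transform y = Wx trained on minibatches: the per-sample forces dI_T/dy_i are chained through the transform to update W. Everything below (function names, the unit-norm row constraint, all hyperparameter defaults) is an illustrative assumption, not the paper's implementation.

```python
import numpy as np

def gauss(d2, s2, dim):
    """Normalized Gaussian kernel values for squared distances d2, variance s2*I."""
    return (2.0 * np.pi * s2) ** (-dim / 2.0) * np.exp(-d2 / (2.0 * s2))

def quadratic_mi_forces(Y, labels, sigma):
    """Quadratic MI  I_T = V_in + V_all - 2 V_btw  between samples Y (N x d)
    and class labels, plus dI_T/dY, the per-sample "information forces"."""
    N, d = Y.shape
    s2 = 2.0 * sigma ** 2                     # two Parzen kernels convolve into one
    diff = Y[:, None, :] - Y[None, :, :]      # pairwise differences, N x N x d
    K = gauss((diff ** 2).sum(-1), s2, d)     # symmetric N x N kernel matrix
    classes, counts = np.unique(labels, return_counts=True)
    P = counts / N                                             # class priors
    pc = P[np.searchsorted(classes, labels)]                   # prior of each sample's class
    same = (labels[:, None] == labels[None, :]).astype(float)  # within-class mask
    V_in = (same * K).sum() / N**2
    V_all = K.sum() / N**2 * (P**2).sum()
    V_btw = (pc[:, None] * K).sum() / N**2
    # Each pair (i, j) enters I_T with weight w_ij; the force on y_i is the
    # correspondingly weighted sum of kernel gradients toward/away from y_j.
    w = same + (P**2).sum() - pc[:, None] - pc[None, :]
    forces = 2.0 * ((w * K)[:, :, None] * (-diff / s2)).sum(axis=1) / N**2
    return V_in + V_all - 2.0 * V_btw, forces

def train_transform(X, labels, d_out, sigma=1.0, lr=0.5, epochs=30, batch=200, seed=0):
    """On-line stochastic gradient ascent of I_T over a linear transform y = W x."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    N, d_in = X.shape
    W = 0.1 * rng.standard_normal((d_out, d_in))
    for _ in range(epochs):
        order = rng.permutation(N)
        for s in range(0, N, batch):
            b = order[s:s + batch]
            Y = X[b] @ W.T
            _, F = quadratic_mi_forces(Y, labels[b], sigma)
            W += lr * F.T @ X[b]               # chain rule: dI/dW = (dI/dY)^T X
            W /= np.linalg.norm(W, axis=1, keepdims=True)  # pin the scale of W
    return W
```

Note that the kernel width sigma must match the scale of the projected data; the crude unit-norm constraint on the rows of W above only pins that scale, and tying sigma to the spread of Y would be a more careful choice.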
Keywords :
Computational modeling; Entropy; Feature extraction; Hidden Markov models; Markov processes; Transforms;
Conference_Title :
2002 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP)
Conference_Location :
Orlando, FL, USA
Print_ISBN :
0-7803-7402-9
DOI :
10.1109/ICASSP.2002.5743865