Title :
A measure of mutual information on the time-frequency plane
Author_Institution :
Dept. of Electr. & Comput. Eng., Michigan State Univ., East Lansing, MI, USA
Abstract :
Information-theoretic characterization of time-frequency distributions has been successful at quantifying the complexity of nonstationary signals. Information measures such as entropy and divergence have been adapted to the time-frequency domain for counting the number of signal components, evaluating the performance of different kernels, and discriminating between signals based on their information content. Inspired by the success of these measures, and in order to develop a more comprehensive information processing theory on the time-frequency plane, we introduce a mutual information measure for time-frequency distributions. The properties of this measure are derived, and its application to signal classification problems is illustrated with examples.
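For illustration only: one generic way to obtain a mutual-information-style quantity on the time-frequency plane is to normalize a positive time-frequency distribution (here a spectrogram) into a joint probability mass function over time and frequency, and then evaluate the Shannon mutual information between its time and frequency marginals. The Python sketch below follows that standard recipe; the helper name tfd_mutual_information and the choice of spectrogram are assumptions for demonstration, and this is not necessarily the specific measure proposed in the paper.

import numpy as np
from scipy import signal

def tfd_mutual_information(x, fs=1.0, eps=1e-12):
    # Hypothetical helper (not from the paper): Shannon mutual information
    # between the time and frequency axes of a spectrogram treated as a
    # joint probability distribution.
    f, t, Sxx = signal.spectrogram(x, fs=fs)
    P = Sxx / (Sxx.sum() + eps)           # joint pmf p(f, t)
    p_f = P.sum(axis=1, keepdims=True)    # frequency marginal p(f)
    p_t = P.sum(axis=0, keepdims=True)    # time marginal p(t)
    return float(np.sum(P * np.log2((P + eps) / (p_f @ p_t + eps))))

# A linear chirp couples time and frequency, so its measure should exceed
# that of a stationary tone, whose frequency content does not vary in time.
fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
chirp = signal.chirp(t, f0=10, f1=400, t1=2.0)
tone = np.sin(2 * np.pi * 100.0 * t)
print(tfd_mutual_information(chirp, fs=fs), tfd_mutual_information(tone, fs=fs))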
Keywords :
entropy; signal classification; signal representation; statistical distributions; time-frequency analysis; divergence; mutual information measure; nonstationary signals; performance; signal components; time-frequency distributions; time-frequency plane; Entropy; Information processing; Information representation; Kernel; Mutual information; Pattern classification; Random variables; Signal processing; Source separation; Time frequency analysis;
Conference_Title :
Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '05), 2005
Print_ISBN :
0-7803-8874-7
DOI :
10.1109/ICASSP.2005.1416050