Title :
Order Statistics Based Estimator for Rényi's Entropy
Author :
Hegde, A.; Lan, T.; Erdogmus, D.
Author_Institution :
Dept. of Electrical & Computer Engineering, University of Florida, Gainesville, FL
Abstract :
Several types of entropy estimators exist in the information theory literature. Most of these estimators explicitly estimate the density of the available data samples before computing the entropy. However, the sample-spacing entropy estimator avoids this intermediate step and computes the entropy directly from the order statistics. In this paper, we extend our horizon beyond Shannon's definition of entropy and analyze entropy estimation performance at higher orders of alpha, using Rényi's generalized entropy estimator. We show that the estimators for higher orders of alpha better approximate the true entropy for an exponential family of distributions. A practical application of this estimator is demonstrated by computing the mutual information between functionally coupled systems. During the estimation process, the joint distributions are decomposed into the sum of their marginals by using linear ICA.
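The sample-spacing approach described in the abstract can be sketched in code. The snippet below is a minimal illustration, not the paper's exact estimator: it assumes the standard 2m-spacing density surrogate f̂(X₍ᵢ₎) = 2m / (n·(X₍ᵢ₊ₘ₎ − X₍ᵢ₋ₘ₎)) built from sorted samples, the common heuristic m ≈ √n, and the plug-in form H_α ≈ (1/(1−α)) log((1/n) Σᵢ f̂(X₍ᵢ₎)^(α−1)); the function name and index-clipping at the boundaries are choices made here for the example.

```python
import numpy as np

def renyi_entropy_mspacing(x, alpha, m=None):
    """Estimate the Renyi entropy of order `alpha` from 1-D samples
    using m-spacings of the order statistics (no explicit density fit).

    Sketch: local density surrogate f_hat(X_(i)) = 2m / (n * (X_(i+m) - X_(i-m))),
    then H_alpha ~ log(mean(f_hat**(alpha-1))) / (1 - alpha).
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(round(np.sqrt(n))))   # common heuristic choice
    idx = np.arange(n)
    lo = np.clip(idx - m, 0, n - 1)          # clip spacing indices at the
    hi = np.clip(idx + m, 0, n - 1)          # boundaries of the sorted sample
    spacings = np.maximum(x[hi] - x[lo], 1e-12)  # guard against tied samples
    f_hat = (hi - lo) / (n * spacings)       # (hi - lo) accounts for clipping
    if abs(alpha - 1.0) < 1e-9:              # Shannon limit as alpha -> 1
        return -np.mean(np.log(f_hat))
    return np.log(np.mean(f_hat ** (alpha - 1.0))) / (1.0 - alpha)
```

For Uniform(0, 1) the true Rényi entropy is 0 at every order, and for Exp(1) it is log(α)/(α−1), which gives quick sanity checks for the sketch.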
Keywords :
entropy; Rényi entropy estimation; Shannon theory; exponential distribution; independent component analysis; linear independent component analysis; functionally coupled systems; information theory; mutual information; order statistics; sample spacing; brain modeling; mutual coupling; performance analysis; sequences; statistical distributions; statistics
Conference_Title :
Machine Learning for Signal Processing, 2005 IEEE Workshop on
Conference_Location :
Mystic, CT
Print_ISBN :
0-7803-9517-4
DOI :
10.1109/MLSP.2005.1532924