DocumentCode :
1085997
Title :
Estimating entropy on m bins given fewer than m samples
Author :
Paninski, Liam
Author_Institution :
Univ. Coll. London, UK
Volume :
50
Issue :
9
fYear :
2004
Firstpage :
2200
Lastpage :
2203
Abstract :
Consider a sequence p_N of discrete probability measures, supported on m_N points, and assume that we observe N independent and identically distributed (i.i.d.) samples from each p_N. We demonstrate the existence of an estimator of the entropy, H(p_N), which is consistent even if the ratio N/m_N is bounded (and, as a corollary, even if this ratio tends to zero, albeit at a sufficiently slow rate).
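The undersampled regime described in the abstract (N samples spread over m > N bins) is exactly where the naive plug-in entropy estimator fails. The paper's consistent estimator is not reproduced here; as a minimal illustrative sketch, the snippet below contrasts the plug-in (maximum-likelihood) estimate with the classical Miller-Madow bias correction on a uniform distribution, showing the severe downward bias when N/m < 1. All function names and the choice of a uniform test distribution are illustrative assumptions, not the paper's construction.

```python
import math
import random

def plugin_entropy(counts):
    """Plug-in (MLE) entropy estimate, in nats, from a list of bin counts."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def miller_madow_entropy(counts):
    """Miller-Madow correction: plug-in + (k_hat - 1) / (2N),
    where k_hat is the number of occupied bins."""
    n = sum(counts)
    k_hat = sum(1 for c in counts if c > 0)
    return plugin_entropy(counts) + (k_hat - 1) / (2 * n)

random.seed(0)
m, n = 1000, 200            # m bins, fewer than m samples: N/m = 0.2
true_h = math.log(m)        # entropy of the uniform distribution on m bins

counts = [0] * m
for _ in range(n):
    counts[random.randrange(m)] += 1

h_mle = plugin_entropy(counts)
h_mm = miller_madow_entropy(counts)
# With N < m, most occupied bins hold a single sample, so the plug-in
# estimate is badly biased downward; Miller-Madow recovers only part
# of the missing entropy.
print(f"true H = {true_h:.3f}, plug-in = {h_mle:.3f}, Miller-Madow = {h_mm:.3f}")
```

The point of the paper is that, unlike these simple estimators, a suitably constructed estimator remains consistent even as N/m_N stays bounded.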
Keywords :
approximation theory; entropy; probability; discrete probability measures; distribution-free bounds; entropy estimation; independent and identically distributed (i.i.d.) samples; maximum likelihood estimation; power measurement; state estimation; statistics; bias; consistency; estimation
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2004.833360
Filename :
1327826