Abstract:
Consider a sequence p_N of discrete probability measures, each supported on m_N points, and assume that we observe N independent and identically distributed (i.i.d.) samples from each p_N. We demonstrate the existence of an estimator of the entropy, H(p_N), that is consistent even if the ratio N/m_N is bounded (and, as a corollary, even if this ratio tends to zero, albeit at a sufficiently slow rate).
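For intuition on why consistency under a bounded ratio N/m_N is non-trivial, the sketch below compares the naive plug-in (maximum likelihood) entropy estimate with the classical Miller-Madow bias correction on a uniform p_N. This is a minimal illustration assuming NumPy; the function names are hypothetical, and neither estimator is the one whose existence the paper proves.

```python
import numpy as np

def plugin_entropy(counts):
    """Plug-in (maximum likelihood) entropy estimate, in nats."""
    n = counts.sum()
    p = counts[counts > 0] / n
    return float(-np.sum(p * np.log(p)))

def miller_madow_entropy(counts):
    """Plug-in estimate plus the (K - 1) / (2N) Miller-Madow correction,
    where K is the number of symbols actually observed."""
    k = np.count_nonzero(counts)
    return plugin_entropy(counts) + (k - 1) / (2 * counts.sum())

# Undersampled regime: N / m_N stays bounded (here, equal to 2).
rng = np.random.default_rng(0)
m = 10_000                      # m_N support points
N = 2 * m                       # N samples, so N / m_N = 2
p = np.full(m, 1.0 / m)         # uniform p_N; true entropy is log(m)
counts = rng.multinomial(N, p)

print(f"true H(p_N) : {np.log(m):.4f}")
print(f"plug-in     : {plugin_entropy(counts):.4f}")      # biased low
print(f"Miller-Madow: {miller_madow_entropy(counts):.4f}")
```

In this regime the plug-in estimate undershoots the true entropy, and the Miller-Madow correction repairs the bias only partially, which is why a consistency guarantee with N/m_N bounded requires a different estimator.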
Keywords:
approximation theory; entropy; probability; discrete probability measures; distribution-free bounds; entropy estimation; independent and identically distributed samples; maximum likelihood estimation; power measurement; state estimation; statistics; bias; consistency; estimation