DocumentCode :
1285997
Title :
Monotonic Convergence in an Information-Theoretic Law of Small Numbers
Author :
Yu, Yaming
Author_Institution :
Dept. of Stat., Univ. of California, Irvine, CA, USA
Volume :
55
Issue :
12
fYear :
2009
Firstpage :
5412
Lastpage :
5422
Abstract :
An "entropy increasing to the maximum" result analogous to the entropic central limit theorem (Barron 1986; Artstein 2004) is obtained in the discrete setting. This involves the thinning operation and a Poisson limit. Monotonic convergence in relative entropy is established for general discrete distributions, while monotonic increase of Shannon entropy is proved for the special class of ultra-log-concave distributions. Overall we extend the parallel between the information-theoretic central limit theorem and law of small numbers explored by Kontoyiannis (2005) and HarremoEumls (2007, 2008, 2009). Ingredients in the proofs include convexity, majorization, and stochastic orders.
Keywords :
Poisson distribution; entropy; Poisson limit; Shannon entropy; binomial thinning; entropic central limit theorem; information theory; monotonic convergence; Convergence; Convolution; Entropy; Gaussian channels; Helium; Information theory; Physics; Random variables; Stochastic processes; Thermodynamics; Binomial thinning; Poisson approximation; Schur-concavity; convex order; logarithmic Sobolev inequality; majorization; maximum entropy; relative entropy; ultra-log-concavity;
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2009.2032727
Filename :
5319737