Title :
A binary analog to the entropy-power inequality
Author :
Shamai, Shlomo ; Wyner, Aaron D.
Author_Institution :
AT&T Bell Lab., Murray Hill, NJ, USA
Date :
11/1/1990
Abstract :
Let {X_n}, {Y_n} be independent stationary binary random sequences with entropies H(X), H(Y), respectively. Let h(ζ) = -ζ log ζ - (1-ζ) log(1-ζ), 0 ⩽ ζ ⩽ 1/2, be the binary entropy function, and let σ(X) = h^{-1}(H(X)), σ(Y) = h^{-1}(H(Y)). Let Z_n = X_n ⊕ Y_n, where ⊕ denotes modulo-2 addition. The following analog of the entropy-power inequality provides a lower bound on H(Z), the entropy of {Z_n}: σ(Z) ⩾ σ(X) * σ(Y), where σ(Z) = h^{-1}(H(Z)) and α * β = α(1-β) + β(1-α). When {Y_n} are independent identically distributed, this reduces to Mrs. Gerber's Lemma of A.D. Wyner and J. Ziv (1973).
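The quantities in the abstract are easy to check numerically. The sketch below (function names h_inv and star are my own, not from the paper) computes the binary entropy h, inverts it on [0, 1/2] by bisection, and verifies the bound σ(Z) ⩾ σ(X) * σ(Y) in the simplest case where both {X_n} and {Y_n} are i.i.d. Bernoulli; there Z_n is Bernoulli(p * q), so the bound in fact holds with equality.

```python
import math

def h(z):
    """Binary entropy in bits; h(0) = h(1) = 0 by convention."""
    if z <= 0.0 or z >= 1.0:
        return 0.0
    return -z * math.log2(z) - (1 - z) * math.log2(1 - z)

def h_inv(u):
    """Inverse of h restricted to [0, 1/2], found by bisection
    (h is strictly increasing on that interval)."""
    lo, hi = 0.0, 0.5
    for _ in range(60):
        mid = (lo + hi) / 2
        if h(mid) < u:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def star(a, b):
    """Binary convolution: a*b = a(1-b) + b(1-a)."""
    return a * (1 - b) + b * (1 - a)

# i.i.d. case: X_n ~ Bernoulli(p), Y_n ~ Bernoulli(q), p, q <= 1/2.
p, q = 0.11, 0.30
sX, sY = h_inv(h(p)), h_inv(h(q))   # recovers p and q
HZ = h(star(p, q))                  # Z_n ~ Bernoulli(p*q)
sZ = h_inv(HZ)
assert sZ >= star(sX, sY) - 1e-9    # the binary EPI bound (equality here)
```

For stationary but non-i.i.d. sequences the entropies become entropy rates and the inequality is generally strict; this toy check only illustrates the definitions, not the proof.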
Keywords :
entropy; information theory; random processes; binary analog; entropy-power inequality; independent stationary binary random sequences; modulo-2 addition; binary sequences; probability density function; random sequences;
Journal_Title :
IEEE Transactions on Information Theory