DocumentCode :
2020893
Title :
A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information
Author :
Rioul, O.
Author_Institution :
Dept. ComElec, Paris Tech Inst. & CNRS LTCI, Paris
fYear :
2007
fDate :
24-29 June 2007
Firstpage :
46
Lastpage :
50
Abstract :
While most useful information-theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy-power inequality (EPI) seems to be an exception: available information-theoretic proofs of the EPI hinge on integral representations of differential entropy using either Fisher's information (FI) or the minimum mean-square error (MMSE). In this paper, we first present a unified view of the proofs via FI and MMSE, showing that they are essentially dual versions of the same proof, and then fill the gap by providing a new, simple proof of the EPI that is based solely on the properties of mutual information and sidesteps both the FI and MMSE representations.
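For reference, the inequality and the two integral representations mentioned above are commonly stated as follows (standard textbook forms; the notation is not necessarily that of the paper itself):

Entropy-power inequality, for independent continuous random variables $X$ and $Y$ with differential entropy $h(\cdot)$:
$$ N(X+Y) \;\ge\; N(X) + N(Y), \qquad N(X) = \frac{1}{2\pi e}\, e^{2 h(X)} . $$

De Bruijn's identity (Fisher-information route), with $Z \sim \mathcal{N}(0,1)$ independent of $X$ and $J(\cdot)$ the Fisher information:
$$ \frac{d}{dt}\, h\bigl(X + \sqrt{t}\, Z\bigr) = \tfrac{1}{2}\, J\bigl(X + \sqrt{t}\, Z\bigr) . $$

I-MMSE relation (MMSE route, due to Guo, Shamai and Verdú):
$$ \frac{d}{d\gamma}\, I\bigl(X;\, \sqrt{\gamma}\, X + Z\bigr) = \tfrac{1}{2}\, \mathrm{mmse}\bigl(X \mid \sqrt{\gamma}\, X + Z\bigr) . $$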
Keywords :
entropy; least mean squares methods; Fisher information; Shannon entropy power inequality; information theoretic inequality; minimum mean-square error; Additive noise; Broadcasting; Entropy; MIMO; Mutual information; Probability density function; Random variables; Source coding; Telecommunications;
fLanguage :
English
Publisher :
ieee
Conference_Title :
2007 IEEE International Symposium on Information Theory (ISIT 2007)
Conference_Location :
Nice
Print_ISBN :
978-1-4244-1397-3
Type :
conf
DOI :
10.1109/ISIT.2007.4557202
Filename :
4557202