DocumentCode :
2943960
Title :
Proof of Entropy Power Inequalities Via MMSE
Author :
Guo, Dongning ; Shamai, Shlomo ; Verdu, Sergio
Author_Institution :
Dept. of Electr. Eng. & Comput. Sci., Northwestern Univ., Evanston, IL
fYear :
2006
fDate :
9-14 July 2006
Firstpage :
1011
Lastpage :
1015
Abstract :
The differential entropy of a random variable (or vector) can be expressed as the integral over signal-to-noise ratio (SNR) of the minimum mean-square error (MMSE) of estimating the variable (or vector) when observed in additive Gaussian noise. This representation sidesteps Fisher's information to provide simple and insightful proofs for Shannon's entropy power inequality (EPI) and two of its variations: Costa's strengthened EPI in the case in which one of the variables is Gaussian, and a generalized EPI for linear transformations of a random vector due to Zamir and Feder.
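As a numerical sketch (not code from the paper), the MMSE representation of differential entropy described in the abstract can be checked for a Gaussian input, whose MMSE in Gaussian noise has the closed form mmse(g) = var/(1 + var·g). The function name, grid bounds, and variance value below are illustrative assumptions.

```python
import numpy as np

def entropy_via_mmse(mmse, lo=1e-8, hi=1e8, n=400001):
    """Differential entropy from the MMSE integral representation:
    h(X) = (1/2)log(2*pi*e) - (1/2) * integral_0^inf [1/(1+g) - mmse(g)] dg.
    The integral is evaluated by the trapezoid rule in t = log(g),
    so the wide SNR range is well resolved (illustrative bounds)."""
    t = np.linspace(np.log(lo), np.log(hi), n)
    g = np.exp(t)
    integrand = (1.0 / (1.0 + g) - mmse(g)) * g  # dg = g dt
    integral = np.sum((integrand[1:] + integrand[:-1]) * np.diff(t)) / 2.0
    return 0.5 * np.log(2 * np.pi * np.e) - 0.5 * integral

# Check against the closed-form Gaussian entropy (1/2)log(2*pi*e*var).
var = 4.0
h_mmse = entropy_via_mmse(lambda g: var / (1.0 + var * g))
h_exact = 0.5 * np.log(2 * np.pi * np.e * var)
print(h_mmse, h_exact)
```

For a Gaussian input the two values agree, illustrating why the representation is convenient: proofs of the EPI reduce to comparing MMSE functions under the integral rather than manipulating Fisher information.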
Keywords :
Gaussian noise; entropy; least mean squares methods; MMSE; SNR; additive Gaussian noise; differential entropy; entropy power inequalities; minimum mean-square error; random vector; signal-to-noise ratio; Additive noise; Computer science; Entropy; Gaussian channels; Gaussian noise; Mutual information; Pollution measurement; Random variables; Signal to noise ratio; Vectors;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2006 IEEE International Symposium on Information Theory
Conference_Location :
Seattle, WA
Print_ISBN :
1-4244-0505-X
Electronic_ISBN :
1-4244-0504-1
Type :
conf
DOI :
10.1109/ISIT.2006.261880
Filename :
4036117