DocumentCode :
2058033
Title :
Mutual information and MMSE in Gaussian channels
Author :
Guo, Dongning ; Shamai, Shlomo ; Verdú, Sergio
Author_Institution :
Princeton Univ., NJ
fYear :
2004
fDate :
2004
Firstpage :
349
Lastpage :
349
Abstract :
Consider arbitrarily distributed input signals observed in additive Gaussian noise. A new fundamental relationship is found between the input-output mutual information and the minimum mean-square error (MMSE) of an estimate of the input given the output: the derivative of the mutual information (in nats) with respect to the signal-to-noise ratio (SNR) is equal to half the MMSE. This identity holds for both scalar and vector signals, as well as for discrete- and continuous-time noncausal MMSE estimation (smoothing). A consequence of the result is a new relationship in continuous-time nonlinear filtering: regardless of the input statistics, the causal MMSE achieved at SNR snr is equal to the expected value of the noncausal MMSE achieved with a channel whose SNR is drawn uniformly between 0 and snr.
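The identity in the abstract can be checked in a closed-form special case. For a scalar standard Gaussian input over the channel Y = √snr·X + N, the mutual information is ½ln(1+snr) nats and the noncausal MMSE is 1/(1+snr); these are standard formulas, and the sketch below (not part of the original record) numerically verifies both the derivative identity and the causal/noncausal averaging relation:

```python
import math

# Scalar Gaussian input X ~ N(0, 1) over Y = sqrt(snr)*X + N, N ~ N(0, 1).
# Closed forms for this special case:
#   I(snr)    = 0.5 * ln(1 + snr)   (mutual information, in nats)
#   mmse(snr) = 1 / (1 + snr)       (noncausal MMSE of X given Y)

def mutual_info(snr):
    return 0.5 * math.log(1.0 + snr)

def mmse(snr):
    return 1.0 / (1.0 + snr)

snr = 2.0
h = 1e-6
# Central-difference approximation of dI/dsnr; the identity says
# this should equal mmse(snr) / 2.
dI = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)
print(dI, 0.5 * mmse(snr))

# Causal/noncausal relation: the causal MMSE at snr equals the average
# of the noncausal MMSE over an SNR drawn uniformly on (0, snr).  Here
# that average has the closed form ln(1 + snr) / snr.
n = 100_000
avg = sum(mmse(snr * (k + 0.5) / n) for k in range(n)) / n
print(avg, math.log(1.0 + snr) / snr)
```

Both printed pairs agree to within the discretization error of the finite difference and the midpoint quadrature.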
Keywords :
AWGN channels; least mean squares methods; Gaussian channels; MMSE; arbitrarily distributed input signals; discrete- and continuous-time noncausal estimation; input-output mutual information; minimum mean-square error; signal-to-noise ratio; Additive noise; Filtering; Gaussian noise; Mutual information; Signal to noise ratio; Smoothing methods; Statistical distributions;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Information Theory, 2004. ISIT 2004. Proceedings. International Symposium on
Conference_Location :
Chicago, IL
Print_ISBN :
0-7803-8280-3
Type :
conf
DOI :
10.1109/ISIT.2004.1365386
Filename :
1365386