Title :
Mutual information and minimum mean-square error in Gaussian channels
Author :
Guo, Dongning ; Shamai, Shlomo ; Verdú, Sergio
Author_Institution :
Dept. of Electr. Eng., Princeton Univ., NJ, USA
Date :
1 April 2005
Abstract :
This paper deals with arbitrarily distributed finite-power input signals observed through an additive Gaussian noise channel. It presents a new formula connecting the input-output mutual information and the minimum mean-square error (MMSE) achievable by optimal estimation of the input given the output: the derivative of the mutual information (in nats) with respect to the signal-to-noise ratio (SNR) is equal to half the MMSE, regardless of the input statistics. This relationship holds for both scalar and vector signals, as well as for discrete-time and continuous-time noncausal MMSE estimation. This fundamental information-theoretic result has an unexpected consequence in continuous-time nonlinear estimation: for any input signal with finite power, the causal filtering MMSE achieved at a given SNR equals the average of the noncausal smoothing MMSE over a channel whose SNR is uniformly distributed between 0 and that SNR.
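In symbols, the two results stated in the abstract read as follows (a restatement for the reader's convenience; the names I, mmse, and cmmse are the notation commonly used for this result and are assumed here, not quoted from the paper body):

```latex
% I-MMSE identity (mutual information in nats, any input distribution):
\frac{\mathrm{d}}{\mathrm{d}\,\mathsf{snr}}\, I(\mathsf{snr}) = \frac{1}{2}\,\mathrm{mmse}(\mathsf{snr})

% Continuous-time consequence: causal MMSE = average of noncausal MMSE over SNRs
\mathrm{cmmse}(\mathsf{snr}) = \frac{1}{\mathsf{snr}}\int_{0}^{\mathsf{snr}} \mathrm{mmse}(\gamma)\,\mathrm{d}\gamma
```

As a quick sanity check of the first identity, the scalar Gaussian input is a case where both sides have well-known closed forms; the sketch below verifies them numerically under that assumption (an illustration, not the paper's general proof):

```python
# Check the I-MMSE identity for Y = sqrt(snr)*X + N with X, N ~ N(0,1) independent.
# For this Gaussian input:  I(snr) = 0.5*ln(1+snr),  mmse(snr) = 1/(1+snr).
import math

def mutual_information(snr: float) -> float:
    """Input-output mutual information in nats for a standard Gaussian input."""
    return 0.5 * math.log(1.0 + snr)

def mmse(snr: float) -> float:
    """Noncausal MMSE of estimating X from Y for a standard Gaussian input."""
    return 1.0 / (1.0 + snr)

snr, h = 2.0, 1e-6
# Central-difference approximation of dI/dsnr.
derivative = (mutual_information(snr + h) - mutual_information(snr - h)) / (2 * h)
print(f"dI/dsnr    = {derivative:.6f}")      # 0.166667
print(f"0.5 * mmse = {0.5 * mmse(snr):.6f}") # 0.166667 -- the two sides agree
```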
Keywords :
Gaussian channels; Gaussian noise; additive noise; additive Gaussian noise channel; information theory; mutual information; minimum mean-square error (MMSE); least mean squares methods; optimal estimation; nonlinear estimation; continuous-time nonlinear estimation; nonlinear filtering; nonlinear filters; filtering; noncausal smoothing; smoothing; smoothing methods; signal-to-noise ratio (SNR); Wiener process; arbitrarily distributed finite-power input signal;
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2005.844072