Title :
Functional Properties of Minimum Mean-Square Error and Mutual Information
Author :
Wu, Yihong ; Verdú, Sergio
Author_Institution :
Department of Electrical Engineering, Princeton University, Princeton, NJ, USA
Date :
1 March 2012
Abstract :
In addition to exploring its various regularity properties, we show that the minimum mean-square error (MMSE) is a concave functional of the input-output joint distribution. In the case of additive Gaussian noise, the MMSE is shown to be weakly continuous in the input distribution and Lipschitz continuous with respect to the quadratic Wasserstein distance for peak-limited inputs. Regularity properties of mutual information are also obtained. Several applications to information theory and the central limit theorem are discussed.
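Example (not part of the published record) :
A minimal numerical sketch of the quantity discussed in the abstract, assuming the standard definition mmse(X, snr) = E[(X - E[X | sqrt(snr) X + N])^2] with N standard Gaussian noise; the function name mmse_discrete, the integration grid, and the example inputs are illustrative choices, not taken from the paper. The final lines check the concavity property for a fixed Gaussian channel: the MMSE of a mixture of two input laws is at least the corresponding mixture of their MMSEs.

import numpy as np
from scipy.stats import norm

def mmse_discrete(values, probs, snr, grid=np.linspace(-12.0, 12.0, 4001)):
    # Numerically evaluate mmse(X, snr) = E[(X - E[X | Y])^2] with
    # Y = sqrt(snr) * X + N, N ~ N(0, 1), for a discrete input X.
    values = np.asarray(values, dtype=float)
    probs = np.asarray(probs, dtype=float)
    lik = norm.pdf(grid[:, None] - np.sqrt(snr) * values[None, :])  # p(y | x_i)
    joint = lik * probs[None, :]                                    # p(y, x_i)
    p_y = joint.sum(axis=1)                                         # output density p(y)
    post = joint / p_y[:, None]                                     # P(X = x_i | y)
    cond_mean = post @ values                                       # E[X | y]
    cond_var = post @ (values ** 2) - cond_mean ** 2                # Var(X | y)
    dy = grid[1] - grid[0]
    return float(np.sum(p_y * cond_var) * dy)                       # E[Var(X | Y)]

# Illustrative concavity check in the input law for a fixed Gaussian channel:
# mixing an equiprobable +/-1 input with a point mass at 0 cannot drive the MMSE
# below the average of the two individual MMSEs.
m_bin = mmse_discrete([-1.0, 1.0], [0.5, 0.5], snr=1.0)
m_pt = mmse_discrete([0.0], [1.0], snr=1.0)                  # deterministic input: MMSE = 0
m_mix = mmse_discrete([-1.0, 0.0, 1.0], [0.25, 0.5, 0.25], snr=1.0)
assert m_mix >= 0.5 * m_bin + 0.5 * m_pt - 1e-9
print(m_bin, m_pt, m_mix)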
Keywords :
additive Gaussian noise; Bayesian statistics; central limit theorem; convergence; entropy; Gaussian noise; information theory; input-output joint distribution; least mean squares methods; minimum mean-square error (MMSE); mutual information; non-Gaussian noise; peak-limited inputs; quadratic Wasserstein distance; random variables
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2011.2174959