Title :
Mismatched Estimation and Relative Entropy
Author_Institution :
Dept. of Electr. Eng., Princeton Univ., Princeton, NJ, USA
Abstract :
A random variable with distribution P is observed in Gaussian noise and is estimated by a mismatched minimum mean-square estimator that assumes the distribution is Q instead of P. This paper shows that the integral over all signal-to-noise ratios (SNRs) of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy D(P||Q) (in nats). This representation of relative entropy can be generalized to nonreal-valued random variables, and can be particularized to give new general representations of mutual information in terms of conditional means. Inspired by the new representation, we also propose a definition of free relative entropy which fills a gap in, and is consistent with, the literature on free probability.
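For concreteness, the identity stated in the abstract can be written out as below. The channel normalization Y_γ = √γ·X + N with standard Gaussian noise, and the symbols mse_P and mse_Q for the matched and mismatched mean-square errors, are notational assumptions made here for illustration and need not match the paper's exact notation.

\[
2\,D(P\|Q) \;=\; \int_0^{\infty} \big[\operatorname{mse}_Q(\gamma) - \operatorname{mse}_P(\gamma)\big]\, d\gamma ,
\]
where, for $Y_\gamma = \sqrt{\gamma}\,X + N$ with $N \sim \mathcal{N}(0,1)$ independent of $X \sim P$,
\[
\operatorname{mse}_Q(\gamma) = \mathbb{E}\big[(X - \mathbb{E}_Q[X \mid Y_\gamma])^2\big],
\qquad
\operatorname{mse}_P(\gamma) = \mathbb{E}\big[(X - \mathbb{E}_P[X \mid Y_\gamma])^2\big],
\]
the outer expectations being taken under the true distribution $P$, and $\mathbb{E}_Q[X \mid \cdot]$ denoting the conditional-mean estimator computed as if $X \sim Q$.

As a quick sanity check under these conventions, for $P=\mathcal{N}(0,p)$ and $Q=\mathcal{N}(0,q)$ both estimators are linear, the excess-MSE integral evaluates to $p/q - 1 - \ln(p/q)$, and this indeed equals $2\,D(P\|Q)$ for the two Gaussians.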
Keywords :
Gaussian noise; entropy; estimation theory; mean-square error methods; probability; signal processing; SNR; mismatched estimation; mismatched minimum mean-square estimator; nonreal-valued random variables; estimation error; information theory; mutual information; random variables; signal-to-noise ratio; divergence; Shannon theory; free probability; minimum mean-square error (MMSE) estimation; relative entropy; statistics
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2010.2050800