Author :
Wu, Yihong ; Verdú, Sergio
Author_Institution :
Dept. of Electr. Eng., Princeton Univ., Princeton, NJ, USA
Abstract :
If N is standard Gaussian, the minimum mean-square error (MMSE) of estimating X based on √(snr)X + N vanishes at least as fast as 1/snr as snr → ∞. We define the MMSE dimension of X as the limit as snr → ∞ of the product of snr and the MMSE. For discrete, absolutely continuous or mixed X we show that the MMSE dimension equals Rényi's information dimension. However, for singular X, we show that the product of snr and MMSE oscillates around the information dimension periodically in snr (dB). We also show that discrete side information does not reduce MMSE dimension. These results extend considerably beyond Gaussian N under various technical conditions.
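The limit described in the abstract can be checked numerically in two simple cases with known behavior: for X ~ N(0,1) (absolutely continuous, information dimension 1) the MMSE has the closed form 1/(1 + snr), so snr·MMSE → 1; for equiprobable X ∈ {−1, +1} (discrete, information dimension 0) the conditional mean estimator is tanh(√snr · Y), and snr·MMSE → 0. The sketch below is illustrative only; the function names and the Monte Carlo sample size are choices made here, not part of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def snr_times_mmse_gaussian(snr):
    # Closed form: for X ~ N(0,1), mmse(X, snr) = 1/(1 + snr),
    # so snr * mmse = snr/(1 + snr) -> 1 (the information dimension).
    return snr / (1.0 + snr)

def snr_times_mmse_binary(snr, n=1_000_000):
    # Monte Carlo estimate for equiprobable X in {-1, +1}:
    # Y = sqrt(snr) * X + N, and E[X | Y] = tanh(sqrt(snr) * Y).
    x = rng.choice([-1.0, 1.0], size=n)
    y = np.sqrt(snr) * x + rng.standard_normal(n)
    mmse = np.mean((x - np.tanh(np.sqrt(snr) * y)) ** 2)
    return snr * mmse  # -> 0 (discrete X has information dimension 0)

for snr in [10.0, 100.0, 1000.0]:
    print(f"snr={snr:7.1f}  gaussian: {snr_times_mmse_gaussian(snr):.4f}"
          f"  binary: {snr_times_mmse_binary(snr):.4f}")
```

As snr grows, the Gaussian column approaches 1 while the binary column collapses to 0, matching the statement that MMSE dimension equals information dimension for absolutely continuous and discrete inputs.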
Keywords :
Gaussian processes; least mean squares methods; MMSE dimension; Renyi information dimension; discrete side information; minimum mean square error; standard Gaussian; Bayesian methods; Closed-form solution; Error analysis; Estimation theory; Gaussian noise; Mean square error methods; Statistical analysis; Statistical distributions; Taylor series; Upper bound;
Conference_Titel :
2010 IEEE International Symposium on Information Theory (ISIT) Proceedings
Conference_Location :
Austin, TX
Print_ISBN :
978-1-4244-7890-3
Electronic_ISBN :
978-1-4244-7891-0
DOI :
10.1109/ISIT.2010.5513599