DocumentCode :
1271945
Title :
Derivative of Mutual Information at Zero SNR: The Gaussian-Noise Case
Author :
Wu, Yihong ; Guo, Dongning ; Verdú, Sergio
Author_Institution :
Dept. of Electr. Eng., Princeton Univ., Princeton, NJ, USA
Volume :
57
Issue :
11
Year :
2011
Firstpage :
7307
Lastpage :
7312
Abstract :
Assuming additive Gaussian noise, a general sufficient condition on the input distribution is established to guarantee that the ratio of mutual information to signal-to-noise ratio (SNR) goes to one half nat as SNR vanishes. The sufficient condition covers SNR-dependent input distributions and allows the presence of side information.
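As a minimal numerical sketch (not from the paper), the limit can be checked in the simplest case satisfying the sufficient condition: the scalar Gaussian channel Y = sqrt(SNR)·X + N with standard Gaussian input X and noise N, for which the mutual information is I(SNR) = (1/2) ln(1 + SNR) nats and hence I(SNR)/SNR → 1/2 nat as SNR → 0.

```python
import numpy as np

# Illustration only: Gaussian input over the Gaussian channel,
# where I(snr) = 0.5 * ln(1 + snr) nats in closed form.
for snr in [1.0, 0.1, 0.01, 1e-3, 1e-6]:
    mi = 0.5 * np.log1p(snr)                 # mutual information in nats
    print(f"snr = {snr:8.1e}   I(snr)/snr = {mi / snr:.6f}")
# The printed ratio approaches 0.5 as snr -> 0, matching the stated limit.
```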
Keywords :
AWGN; Gaussian channels; approximation theory; SNR-dependent input distribution; additive Gaussian noise; first-order approximation; incremental-SNR channel; mutual information derivative; channel capacity; Gaussian noise; mutual information; random variables; upper bound; low-power regime; minimum mean-square error (MMSE); signal-to-noise ratio (SNR)
Language :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2011.2161752
Filename :
5953516