DocumentCode :
3512730
Title :
Mutual information, relative entropy, and estimation in the Poisson channel
Author :
Atar, Rami ; Weissman, Tsachy
fYear :
2011
fDate :
July 31 - Aug. 5, 2011
Firstpage :
708
Lastpage :
712
Abstract :
Let X be a non-negative random variable and let the conditional distribution of a random variable Y, given X, be Poisson(γ·X), for a parameter γ ≥ 0. We identify a natural loss function such that: 1. The derivative of the mutual information between X and Y with respect to γ is equal to the minimum mean loss in estimating X based on Y, regardless of the distribution of X. 2. When X ~ P is estimated based on Y by a mismatched estimator that would have minimized the expected loss had X ~ Q, the integral over all values of γ of the excess mean loss is equal to the relative entropy between P and Q. For a continuous-time setting where X^T = {X_t, 0 ≤ t ≤ T} is a non-negative stochastic process and the conditional law of Y^T = {Y_t, 0 ≤ t ≤ T}, given X^T, is that of a non-homogeneous Poisson process with intensity function γ·X^T, under the same loss function: 1. The minimum mean loss in causal filtering when γ = γ_0 is equal to the expected value of the minimum mean loss in non-causal filtering (smoothing) achieved with a channel whose parameter γ is uniformly distributed between 0 and γ_0. Bridging the two quantities is the mutual information between X^T and Y^T. 2. This relationship between the mean losses in causal and non-causal filtering also holds in the case where the filters employed are mismatched, i.e., optimized assuming a law on X^T which is not the true one. Bridging the two quantities in this case is the sum of the mutual information and the relative entropy between the true and the mismatched distributions of Y^T. Thus, relative entropy quantifies the excess estimation loss due to mismatch in this setting. These results parallel those recently found for the Gaussian channel: the I-MMSE relationship of Guo, Shamai and Verdú; the relative entropy and mismatched estimation relationship of Verdú; and the relationship between causal and non-causal mismatched estimation of Weissman.
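The two scalar-channel identities above can be checked numerically. The following is a minimal sketch, not taken from the paper itself: it uses the loss l(x, x̂) = x·log(x/x̂) - x + x̂ (the loss that plays this role for the Poisson channel), and the two-point laws P and Q, the output-alphabet truncation, and the quadrature grid are all illustrative assumptions made here.

# Minimal numerical sketch (illustrative assumptions, not from the paper's
# text) of the two scalar Poisson-channel identities described above, with
# the loss  l(x, xhat) = x*log(x/xhat) - x + xhat.
# The two-point laws P and Q, the truncation of the output alphabet, and the
# quadrature grid over gamma are all hypothetical choices for illustration.

import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import poisson

xs = np.array([1.0, 3.0])      # support of X (assumed example)
P = np.array([0.5, 0.5])       # true law of X
Q = np.array([0.2, 0.8])       # mismatched law of X
ys = np.arange(400)            # truncated Poisson output alphabet

def pmf(gamma):
    # p(y|x) for Y | X = x ~ Poisson(gamma*x); shape (len(xs), len(ys))
    return poisson.pmf(ys[None, :], gamma * xs[:, None])

def loss(x, xhat):
    # The natural loss for the Poisson channel
    return x * np.log(x / xhat) - x + xhat

def mutual_info(gamma, px=P):
    # I(X; Y) in nats, by direct summation over the joint law
    pyx = pmf(gamma)
    joint = px[:, None] * pyx
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log(pyx / (px @ pyx))
    return np.nansum(terms)

def mean_loss(gamma, assumed, truth=P):
    # E_truth[ l(X, xhat(Y)) ], where xhat is the conditional mean of X
    # given Y computed under the (possibly mismatched) law `assumed`
    pyx = pmf(gamma)
    joint = truth[:, None] * pyx
    with np.errstate(divide="ignore", invalid="ignore"):
        xhat = (assumed * xs) @ pyx / (assumed @ pyx)
        return np.nansum(joint * loss(xs[:, None], xhat[None, :]))

# Identity 1: dI/dgamma equals the minimum mean loss (checked at gamma = 2
# via a central finite difference).
g, h = 2.0, 1e-4
dI = (mutual_info(g + h) - mutual_info(g - h)) / (2 * h)
print(dI, mean_loss(g, P))

# Identity 2: the excess mean loss of the Q-optimal estimator under P,
# integrated over gamma, recovers the relative entropy D(P||Q).
gammas = np.linspace(1e-3, 60.0, 3000)
excess = np.array([mean_loss(g, Q) - mean_loss(g, P) for g in gammas])
print(trapezoid(excess, gammas), np.sum(P * np.log(P / Q)))

Each printed pair should agree closely, up to the truncation of the output alphabet, the finite-difference step, and the quadrature error from integrating over a finite range of γ.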
Keywords :
Gaussian channels; Poisson equation; entropy; least mean squares methods; smoothing methods; Gaussian channel; I-MMSE relationship; Poisson channel; mismatched estimator; mutual information; natural loss function; noncausal filtering; nonhomogeneous Poisson process; nonnegative random variable; nonnegative stochastic process; relative entropy; smoothing; Channel estimation; Estimation; Loss measurement; Random variables; Signal to noise ratio; Causal estimation; Divergence; Girsanov transformation; I-MMSE; Mismatched estimation; Nonlinear filtering; Point processes; Shannon theory; Statistics
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Information Theory Proceedings (ISIT), 2011 IEEE International Symposium on
Conference_Location :
St. Petersburg, Russia
ISSN :
2157-8095
Print_ISBN :
978-1-4577-0596-0
Electronic_ISBN :
2157-8095
Type :
conf
DOI :
10.1109/ISIT.2011.6034225
Filename :
6034225