  • DocumentCode
    639934
  • Title
    Pointwise relations between information and estimation in the Poisson channel
  • Author
    Jiao, Jiantao; Venkat, Kartik; Weissman, Tsachy
  • fYear
    2013
  • fDate
    7-12 July 2013
  • Firstpage
    449
  • Lastpage
    453
  • Abstract
    Identities yielding optimal estimation interpretations for mutual information and relative entropy - paralleling those known for minimum mean squared error estimation under additive Gaussian noise - were recently discovered for the Poisson channel by Atar and Weissman. We express these identities as equalities between expectations of the associated estimation- and information-theoretic random variables, such as the actual estimation loss and the information density. By explicitly characterizing the relations between these random variables, we show that they are related in much stronger pointwise senses that directly imply the known expectation identities while deepening our understanding of them. As an example of the nature of our results, consider the equality between the mutual information and the mean cumulative filtering loss of the optimal filter in continuous-time estimation. We show that the difference between the information density and the cumulative filtering loss is a martingale expressible as a stochastic integral. This explicit characterization not only directly recovers the previously known expectation relation, but also allows us to characterize other distributional properties of the random variables involved, in which some of the original objects of interest emerge in new and surprising roles. For example, we find that the increasing predictable part of the Doob-Meyer decomposition of the information density (which is a sub-martingale) is nothing but the cumulative loss of the optimal filter.
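    A minimal sketch of the expectation identity referenced above, under assumed standard notation not given in this record: Y^T is a doubly stochastic Poisson process whose intensity is the signal X_t, and \hat{X}_t = \mathbb{E}[X_t \mid Y^t] denotes the optimal (causal) filter. The Atar-Weissman identity then reads, in LaTeX,

    \[
      I(X^T; Y^T) \;=\; \mathbb{E}\!\int_0^T \ell\bigl(X_t, \hat{X}_t\bigr)\,dt,
      \qquad
      \ell(x,\hat{x}) \;=\; x\log\frac{x}{\hat{x}} \;-\; x \;+\; \hat{x},
    \]

    i.e. the mutual information equals the mean cumulative filtering loss under the natural Poisson loss function; the pointwise results of the paper concern the difference between this cumulative loss and the information density, not just their expectations.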
  • Keywords
    AWGN channels; estimation theory; least mean squares methods; Doob-Meyer decomposition; Poisson channel; additive Gaussian noise; continuous-time estimation; cumulative filtering loss; information density; minimum mean squared estimation; mutual information; optimal estimation interpretations; optimal filter; pointwise relations; relative entropy; Channel estimation; Entropy; Estimation; Mutual information; Signal to noise ratio; Stochastic processes;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2013 IEEE International Symposium on Information Theory (ISIT)
  • Conference_Location
    Istanbul
  • ISSN
    2157-8095
  • Type
    conf
  • DOI
    10.1109/ISIT.2013.6620266
  • Filename
    6620266