  • DocumentCode
    1506504
  • Title
    Data-Processing Inequalities Based on a Certain Structured Class of Information Measures With Application to Estimation Theory

  • Author
    Merhav, Neri

  • Author_Institution
    Dept. of Electr. Eng., Technion - Israel Inst. of Technol., Haifa, Israel
  • Volume
    58
  • Issue
    8
  • fYear
    2012
  • Firstpage
    5287
  • Lastpage
    5301
  • Abstract
    We study data-processing inequalities that are derived from a certain class of generalized information measures, in which a series of convex functions and multiplicative likelihood ratios is nested alternately. While these information measures can be viewed as a special case of the most general Zakai-Ziv generalized information measure, this special nested structure calls for attention and motivates our study. Specifically, a certain choice of the convex functions leads to an information measure that extends the notion of the Bhattacharyya distance (or the Chernoff divergence): while the ordinary Bhattacharyya distance is based on the (weighted) geometric mean of two replicas of the channel's conditional distribution, the more general information measure allows an arbitrary number of such replicas. We apply the data-processing inequality induced by this information measure to a detailed study of lower bounds on parameter estimation under additive white Gaussian noise (AWGN) and show that, in certain cases, tighter bounds can be obtained by using more than two replicas. While the resulting lower bound may not compete favorably with the best bounds available for the ordinary AWGN channel, the advantage of the new lower bound, relative to the other bounds, becomes significant in the presence of channel uncertainty, such as unknown fading. This different behavior in the presence of channel uncertainty is explained by the convexity property of the information measure.
  • Keywords
    AWGN channels; estimation theory; geometry; AWGN channel; Bhattacharyya distance; Chernoff divergence; Zakai-Ziv generalized information measure; additive white Gaussian noise; channel uncertainty; convex function; convexity property; data-processing inequalities; data-processing inequality; estimation theory; multiplicative likelihood ratio; nested structure; structured class; weighted geometric mean; Channel coding; Convex functions; Data processing; Mutual information; Random variables; Uncertainty; Bhattacharyya distance; Chernoff divergence; Gallager function; data-processing inequality; fading; parameter estimation;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2012.2197175
  • Filename
    6193211