• DocumentCode
    905011
  • Title
    Logarithmic Sobolev Inequalities for Information Measures

  • Author
    Kitsos, Christos P.; Tavoularis, Nikolaos K.

  • Author_Institution
    Dept. of Math., Technol. Educ. Inst. of Athens, Athens
  • Volume
    55
  • Issue
    6
  • fYear
    2009
  • fDate
    6/1/2009 12:00:00 AM
  • Firstpage
    2554
  • Lastpage
    2561
  • Abstract
    For α ≥ 1, the new Vajda-type information measure J_α(X) is a quantity generalizing Fisher's information (FI), to which it reduces for α = 2. In this paper, a corresponding generalized entropy power N_α(X) is introduced, and the inequality N_α(X) J_α(X) ≥ n is proved, which reduces to the well-known inequality of Stam for α = 2. The cases of equality are also determined. Furthermore, the Blachman-Stam inequality for the FI of convolutions is generalized to the Vajda information J_α(X), and both families of results are discussed in the context of measures of information. That is, logarithmic Sobolev inequalities (LSIs) are written in terms of the new, more general entropy-type information measure, and therefore new information inequalities arise. In special cases, this generalization yields the well-known information measures and the corresponding bounds. A brief sketch of the α = 2 reference case is given after this abstract.
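  • Note
    The following is a minimal LaTeX sketch of the classical α = 2 case (Stam's inequality) that the abstract says the result reduces to; the definitions of J_α(X) and N_α(X) for general α are the paper's own and are not reproduced here, so only the abstract's statement of the generalized inequality is repeated.

    % Classical Stam inequality (the alpha = 2 case named in the abstract):
    % for a random vector X on R^n with density f, entropy power N(X) and
    % Fisher information J(X) satisfy N(X) J(X) >= n, with equality iff X is Gaussian.
    \[
      N(X)\, J(X) \;\ge\; n, \qquad
      N(X) = \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)}, \qquad
      J(X) = \int_{\mathbb{R}^n} \frac{\lVert \nabla f(x) \rVert^{2}}{f(x)}\, dx .
    \]
    % The paper's generalization, as stated in the abstract (definitions of
    % N_alpha and J_alpha are given in the paper itself):
    \[
      N_\alpha(X)\, J_\alpha(X) \;\ge\; n, \qquad \alpha \ge 1 ,
    \]
    % which recovers Stam's inequality when alpha = 2.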
  • Keywords
    entropy; Blachman-Stam inequality; Fisher information; Vajda-type information measure; logarithmic Sobolev inequalities; Calibration; Cramer-Rao bounds; Density measurement; Design for experiments; Educational technology; Entropy; Gaussian distribution; Information theory; Mathematics; Statistical analysis; Fisher's measure of information; Shannon and Rényi entropy; Vajda measure of information; logarithmic Sobolev inequalities (LSIs);
  • fLanguage
    English
  • Journal_Title
    Information Theory, IEEE Transactions on
  • Publisher
    ieee
  • ISSN
    0018-9448
  • Type
    jour

  • DOI
    10.1109/TIT.2009.2018179
  • Filename
    4957631