• DocumentCode
    1490730
  • Title

    Rényi's divergence and entropy rates for finite alphabet Markov sources

  • Author

    Rached, Ziad ; Alajaji, Fady ; Campbell, L. Lorne

  • Author_Institution
    Dept. of Math. & Stat., Queen's Univ., Kingston, Ont., Canada
  • Volume
    47
  • Issue
    4
  • fYear
    2001
  • fDate
    5/1/2001 12:00:00 AM
  • Firstpage
    1553
  • Lastpage
    1561
  • Abstract
    In this work, we examine the existence and the computation of the Rényi divergence rate, lim_{n→∞} (1/n) D_α(p^(n)||q^(n)), between two time-invariant finite-alphabet Markov sources of arbitrary order and arbitrary initial distributions described by the probability distributions p^(n) and q^(n), respectively. This yields a generalization of a result of Nemetz (1974), who assumed that the initial probabilities under p^(n) and q^(n) are strictly positive. The main tools used to obtain the Rényi divergence rate are the theory of nonnegative matrices and Perron-Frobenius theory. We also provide numerical examples and investigate the limits of the Rényi divergence rate as α→1 and as α↓0. Similarly, we provide a formula for the Rényi entropy rate lim_{n→∞} (1/n) H_α(p^(n)) of Markov sources and examine its limits as α→1 and as α↓0. Finally, we briefly provide an application to source coding.
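    The entropy-rate formula described in the abstract can be sketched numerically. For an irreducible first-order chain with transition matrix P, the Rényi entropy rate (α ≠ 1) equals (1/(1−α)) log λ(α), where λ(α) is the Perron root of the nonnegative matrix with entries P[i,j]^α; as α→1 it recovers the Shannon entropy rate. This is a minimal illustration, not the paper's general treatment of arbitrary order and initial distributions; the function names and example matrix are hypothetical.

    ```python
    import numpy as np

    def renyi_entropy_rate(P, alpha):
        """Rényi entropy rate of an irreducible finite Markov chain (alpha != 1).

        Computes (1/(1-alpha)) * log(lambda), where lambda is the Perron
        root (spectral radius) of the matrix with entries P[i, j]**alpha.
        """
        R = P ** alpha                          # elementwise power
        lam = max(abs(np.linalg.eigvals(R)))    # spectral radius = Perron root
        return float(np.log(lam) / (1.0 - alpha))

    def shannon_entropy_rate(P):
        """Shannon entropy rate: sum_i pi_i * H(P[i, :]) with stationary pi."""
        vals, vecs = np.linalg.eig(P.T)
        pi = np.real(vecs[:, np.argmax(np.real(vals))])
        pi = pi / pi.sum()
        row_entropies = -np.sum(np.where(P > 0, P * np.log(P), 0.0), axis=1)
        return float(pi @ row_entropies)

    # Illustrative two-state chain (strictly positive, hence irreducible).
    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
    # As alpha -> 1, the Rényi rate approaches the Shannon rate.
    print(renyi_entropy_rate(P, 1.000001), shannon_entropy_rate(P))
    ```

    Entropy rates here are in nats; dividing by log 2 converts to bits.
    
    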
  • Keywords
    Markov processes; entropy; matrix algebra; probability; source coding; Perron-Frobenius theory; Rényi divergence rate; Rényi entropy rate; Rényi's divergence; entropy rates; nonnegative matrices; probability distributions; time-invariant finite-alphabet Markov sources; Councils; Distributed computing; Entropy; Information analysis; Information theory; Mathematics; Source coding; Statistics;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type

    jour

  • DOI
    10.1109/18.923736
  • Filename
    923736