Title :
Rescaling Entropy and Divergence Rates
Author :
Girardin, Valerie ; Lhote, Loick
Author_Institution :
Lab. de Math. Nicolas Oresme, Univ. de Caen Normandie, Caen, France
Abstract :
By rescaling with a suitable sequence instead of the number of time units, the usual notion of divergence rate is extended here to define and determine meaningful generalized divergence rates. Rescaled entropy rates appear as a special case. A suitable rescaling sequence is naturally induced by the asymptotic behavior of the marginal divergences. Closed-form formulas are obtained as soon as the marginal divergences behave like powers of some analytic functions, and a wide class of countable Markov chains is proved to satisfy this property. Most divergence and entropy functionals defined in the literature are covered, e.g., the classical Shannon, Kullback-Leibler, Rényi, and Tsallis functionals. For illustration, the Ferreri and Basu-Harris-Hjort-Jones functionals, among others, are also considered.
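As a minimal illustration of the classical (unrescaled) rates that the paper generalizes, the sketch below computes the Shannon entropy rate of a finite irreducible Markov chain, h = -Σ_i π_i Σ_j P_ij log P_ij, and the Rényi entropy rate of order α ≠ 1 via the Perron root of the entrywise power matrix P^(α) (the Rached-Alajaji-Campbell formula for finite chains). This is an assumed illustrative example, not the paper's construction: the paper's contribution concerns countable chains and rescaled generalized rates, which this finite-state sketch does not cover.

```python
import numpy as np

def stationary(P):
    """Stationary distribution of an irreducible finite chain:
    the left eigenvector of P for eigenvalue 1, normalized to sum to 1."""
    w, v = np.linalg.eig(P.T)
    i = np.argmin(np.abs(w - 1.0))
    pi = np.real(v[:, i])
    return pi / pi.sum()

def shannon_rate(P):
    """Shannon entropy rate h = -sum_i pi_i sum_j P_ij log P_ij, in nats."""
    pi = stationary(P)
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)  # convention 0 log 0 = 0
    return float(-(pi @ (P * logs).sum(axis=1)))

def renyi_rate(P, alpha):
    """Renyi entropy rate of order alpha != 1 for a finite irreducible chain:
    (1 / (1 - alpha)) * log of the Perron root of the matrix with entries P_ij^alpha."""
    lam = max(np.abs(np.linalg.eigvals(P ** alpha)))
    return float(np.log(lam) / (1.0 - alpha))

# Example: a two-state chain (values chosen arbitrarily for illustration)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
print(shannon_rate(P))       # -> about 0.3835 nats
print(renyi_rate(P, 0.5))    # order-1/2 Renyi rate, >= Shannon rate
```

As α → 1 the Rényi rate recovers the Shannon rate, which gives a quick sanity check on both functions.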
Keywords :
Markov processes; entropy; Kullback-Leibler functional; Rényi functional; Shannon functional; Tsallis functional; asymptotic marginal divergence behavior; closed-form formulas; countable Markov chains; entropy rate rescaling; generalized divergence rates; Information theory; Measurement; Polynomials; Random sequences; divergence rate; Kullback-Leibler divergence; Markov chain; Rényi entropy; Shannon entropy; Tsallis entropy; entropy functional; entropy rate;
Journal_Title :
Information Theory, IEEE Transactions on
DOI :
10.1109/TIT.2015.2476486