Title :
Computation and Estimation of Generalized Entropy Rates for Denumerable Markov Chains
Author :
Ciuperca, Gabriela ; Girardin, Valérie ; Lhote, Loïck
Author_Institution :
Lab. de Probabilités, Combinatoire et Statistique, Univ. de Lyon, Villeurbanne, France
fDate :
7/1/2011 12:00:00 AM
Abstract :
We study entropy rates of random sequences for general entropy functionals, including the classical Shannon and Rényi entropies and the more recent Tsallis and Sharma-Mittal ones. In the first part, we obtain an explicit formula for the entropy rate for a large class of entropy functionals, as soon as the process satisfies a regularity property known in dynamical systems theory as the quasi-power property. Sequences of independent and identically distributed random variables naturally satisfy this property. Markov chains are proven to satisfy it, too, under simple explicit conditions on their transition probabilities. All the entropy rates under study are thus shown to be either infinite or zero, except at a threshold where they equal the Shannon or Rényi entropy rates up to a multiplicative constant. In the second part, we focus on the estimation of the marginal generalized entropy and of the entropy rate for parametric Markov chains. Estimators with good asymptotic properties are built through a plug-in procedure using a maximum likelihood estimation of the parameter.
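To make the entropy functionals named in the abstract concrete, the sketch below computes the Shannon entropy and the Rényi and Tsallis entropies of order q for a finite probability vector, using their standard textbook definitions (natural logarithm); it is an illustration of the functionals only, not of the paper's entropy-rate formula or its plug-in estimators.

```python
import math

def shannon(p):
    # Shannon entropy: -sum_i p_i log p_i (natural logarithm).
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, q):
    # Rényi entropy of order q != 1: (1 / (1 - q)) * log(sum_i p_i^q).
    if q == 1:
        return shannon(p)  # Rényi entropy reduces to Shannon entropy at q = 1.
    return math.log(sum(pi ** q for pi in p if pi > 0)) / (1 - q)

def tsallis(p, q):
    # Tsallis entropy of order q != 1: (1 - sum_i p_i^q) / (q - 1).
    if q == 1:
        return shannon(p)  # Tsallis entropy also reduces to Shannon entropy at q = 1.
    return (1 - sum(pi ** q for pi in p if pi > 0)) / (q - 1)

# Example distribution (hypothetical, for illustration only).
p = [0.5, 0.25, 0.25]
print(shannon(p))       # -> 1.5 * ln(2) ≈ 1.0397
print(renyi(p, 2.0))    # -> -ln(0.375) ≈ 0.9808
print(tsallis(p, 2.0))  # -> 0.625
```

Both generalized entropies converge to the Shannon entropy as q tends to 1, which is consistent with the threshold behavior of the rates described in the abstract.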
Keywords :
Markov processes; entropy; maximum likelihood estimation; probability; random sequences; random variables; Rényi entropy; Shannon entropy; Tsallis entropy; denumerable Markov chains; generalized entropy rates; plug-in procedure; density functional theory; eigenvalues and eigenfunctions; estimation; entropy functional; entropy rate; parametric Markov chain; plug-in estimation;
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2011.2133710