DocumentCode
2275312
Title
Approximations for the entropy rate of a hidden Markov process
Author
Ordentlich, Erik; Weissman, Tsachy
Author_Institution
Hewlett-Packard Lab., Palo Alto, CA
fYear
2005
fDate
4-9 Sept. 2005
Firstpage
2198
Lastpage
2202
Abstract
Let {Xt} be a stationary finite-alphabet Markov chain and {Zt} denote its noisy version when corrupted by a discrete memoryless channel. We present an approach to bounding the entropy rate of {Zt} by the construction and study of a related measure-valued Markov process. To illustrate its efficacy, we specialize it to the case of a BSC-corrupted binary Markov chain. The bounds obtained are sufficiently tight to characterize the behavior of the entropy rate in asymptotic regimes that exhibit a "concentration of the support". Examples include the 'high SNR', 'low SNR', 'rare spikes', and 'weak dependence' regimes. Our analysis also gives rise to a deterministic algorithm for approximating the entropy rate, achieving the best known precision-complexity tradeoff for a significant subset of the process parameter space.
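As a concrete reference point for the quantity being approximated, the sketch below is a minimal illustration, not the measure-valued construction or the deterministic algorithm of the paper: it computes the classical conditional-entropy upper bound H(Z_n | Z_1, ..., Z_{n-1}) for a BSC-corrupted binary symmetric Markov chain by exact enumeration, using the standard forward (filtering) recursion. The flip probability p, crossover probability eps, and block lengths n are illustrative assumptions.

```python
import itertools
import math


def joint_prob(z, p, eps):
    """P(Z_1..Z_n = z) for a symmetric binary Markov chain X_t
    (flip probability p, stationary start) observed through a BSC(eps),
    computed with the standard forward recursion."""
    # Forward variables: alpha[x] = P(z_1..z_t, X_t = x).
    alpha = [0.5 * (1 - eps if z[0] == x else eps) for x in (0, 1)]
    for zt in z[1:]:
        alpha = [
            sum(alpha[xp] * (1 - p if xp == x else p) for xp in (0, 1))
            * (1 - eps if zt == x else eps)
            for x in (0, 1)
        ]
    return sum(alpha)


def block_entropy(n, p, eps):
    """H(Z_1..Z_n) in bits, by exact enumeration over all 2^n output strings."""
    H = 0.0
    for z in itertools.product((0, 1), repeat=n):
        q = joint_prob(z, p, eps)
        if q > 0.0:
            H -= q * math.log2(q)
    return H


def entropy_rate_upper_bound(n, p, eps):
    """Conditional entropy H(Z_n | Z_1..Z_{n-1}) = H(Z^n) - H(Z^{n-1}),
    which decreases monotonically to the entropy rate of {Z_t} as n grows."""
    return block_entropy(n, p, eps) - block_entropy(n - 1, p, eps)


if __name__ == "__main__":
    # Illustrative parameters: chain flip probability 0.1, BSC crossover 0.2.
    for n in (2, 4, 6, 8):
        print(n, entropy_rate_upper_bound(n, p=0.1, eps=0.2))
```

Note that this brute-force bound costs O(2^n) per block length, which is exactly the precision-complexity tradeoff the paper's deterministic algorithm is designed to improve on.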
Keywords
deterministic algorithms; discrete systems; entropy; hidden Markov models; memoryless systems; BSC-corrupted binary Markov chain; SNR; asymptotic regimes; deterministic algorithm; discrete memoryless channel; entropy rate approximations; hidden Markov process; noisy version; precision-complexity tradeoff; rare spikes; stationary finite-alphabet Markov chain; weak dependence regime; Algorithm design and analysis; Channel coding; Entropy; Hidden Markov models; Integral equations; Kernel; Laboratories; Markov processes; Memoryless systems; Random variables;
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of the International Symposium on Information Theory (ISIT 2005)
Conference_Location
Adelaide, SA
Print_ISBN
0-7803-9151-9
Type
conf
DOI
10.1109/ISIT.2005.1523737
Filename
1523737