Title :
Input-constrained erasure channels: Mutual information and capacity
Author :
Yonglong Li ; Guangyue Han
Author_Institution :
Univ. of Hong Kong, Hong Kong, China
Date :
June 29 - July 4, 2014
Abstract :
In this paper, we derive an explicit formula for the entropy rate of the hidden Markov chain observed when a Markov chain passes through a memoryless erasure channel. This result naturally leads to an explicit formula for the mutual information rate of memoryless erasure channels with Markovian inputs. Moreover, when the input Markov chain is first-order and supported on the (1, ∞)-run-length-limited (RLL) constraint, we show that the mutual information rate is strictly concave with respect to a chosen parameter. We then apply a recent algorithm [1] to approximately compute the first-order noisy constrained channel capacity and the corresponding capacity-achieving distribution.
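As a concrete illustration (not taken from the paper itself): a first-order Markov input on the (1, ∞)-RLL constraint, i.e. binary sequences with no two consecutive 1s, can be parameterized by a single transition probability p = P(next symbol is 1 | current symbol is 0). In the noiseless limit (erasure probability 0) the mutual information rate reduces to the entropy rate of this chain, which has the closed form h(p)/(1+p) with h the binary entropy function. A minimal Python sketch, with the function names being our own choices:

```python
import math

def binary_entropy(p):
    """Entropy (in bits) of a Bernoulli(p) source."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rll_entropy_rate(p):
    """Entropy rate of the first-order Markov chain on the (1, inf)-RLL
    constraint, parameterized by p = P(1 | 0).  Its stationary distribution
    is pi_0 = 1/(1+p), pi_1 = p/(1+p); state 1 is forced to emit a 0, so
    only state 0 contributes uncertainty, giving h(p)/(1+p)."""
    return binary_entropy(p) / (1 + p)

# Noiseless sanity check: maximizing over p recovers the (1, inf)-RLL
# capacity log2((1+sqrt(5))/2) ~ 0.6942, attained near p ~ 0.382.
best_p = max((i / 10000 for i in range(1, 10000)), key=rll_entropy_rate)
print(best_p, rll_entropy_rate(best_p))
```

The noisy constrained capacity studied in the paper is obtained by maximizing the mutual information rate, rather than this noiseless entropy rate, over the same one-parameter family.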
Keywords :
channel capacity; hidden Markov models; runlength codes; Markovian inputs; first-order noisy constrained channel capacity; hidden Markov chain; input-constrained erasure channel; memoryless erasure channel; run length limited constraint; Approximation algorithms; Channel capacity; Hidden Markov models; Markov processes; Mutual information; Noise measurement;
Conference_Titel :
2014 IEEE International Symposium on Information Theory (ISIT)
Conference_Location :
Honolulu, HI
DOI :
10.1109/ISIT.2014.6875399