DocumentCode :
943712
Title :
Conditional limit theorems under Markov conditioning
Author :
Csiszár, Imre ; Cover, Thomas M. ; Choi, Byoung-seon
Volume :
33
Issue :
6
fYear :
1987
fDate :
11/1/1987
Firstpage :
788
Lastpage :
801
Abstract :
Let X_{1}, X_{2}, \cdots be independent identically distributed random variables taking values in a finite set X, and consider the conditional joint distribution of the first m elements of the sample X_{1}, \cdots, X_{n}, on the condition that X_{1} = x_{1} and the sliding-block sample average of a function h(\cdot, \cdot) defined on X^{2} exceeds a threshold \alpha > Eh(X_{1}, X_{2}). For m fixed and n \rightarrow \infty, this conditional joint distribution is shown to converge to the m-step joint distribution of a Markov chain started in x_{1} which is closest to X_{1}, X_{2}, \cdots in Kullback-Leibler information divergence among all Markov chains whose two-dimensional stationary distribution P(\cdot, \cdot) satisfies \sum P(x, y) h(x, y) \geq \alpha, provided some distribution P on X^{2} having equal marginals satisfies this constraint with strict inequality. Similar conditional limit theorems are obtained when X_{1}, X_{2}, \cdots is an arbitrary finite-order Markov chain and more general conditioning is allowed.
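
The following is a minimal numerical sketch, not taken from the paper, of the variational problem stated in the abstract: among pair distributions P(x, y) with equal marginals satisfying \sum P(x, y) h(x, y) \geq \alpha, find the one whose stationary Markov chain is closest in Kullback-Leibler divergence rate to the i.i.d. source. The alphabet, the choice of h, the reference distribution Q, and the threshold alpha below are illustrative assumptions.

# Sketch (assumed example, not the authors' method): I-projection onto the
# constraint set, computed by generic constrained optimization.
import numpy as np
from scipy.optimize import minimize

K = 2                                   # alphabet size |X| (assumed)
Q = np.full(K, 1.0 / K)                 # i.i.d. source distribution (assumed uniform)
h = np.array([[0.0, 1.0],
              [1.0, 2.0]])              # h(x, y) = x + y (illustrative), so Eh = 1.0
alpha = 1.2                             # threshold above Eh(X_1, X_2)

def divergence_rate(p_flat):
    # D(P || P1 x Q): divergence rate of the stationary Markov chain with pair
    # distribution P relative to the i.i.d. process with marginal Q.
    P = p_flat.reshape(K, K)
    P1 = P.sum(axis=1)                  # stationary (row) marginal
    ref = np.outer(P1, Q)
    mask = P > 1e-12
    return np.sum(P[mask] * np.log(P[mask] / ref[mask]))

constraints = [
    {"type": "eq",   "fun": lambda p: p.sum() - 1.0},                        # probability
    {"type": "eq",   "fun": lambda p: p.reshape(K, K).sum(axis=1)
                                      - p.reshape(K, K).sum(axis=0)},        # equal marginals
    {"type": "ineq", "fun": lambda p: np.sum(p.reshape(K, K) * h) - alpha},  # threshold constraint
]

p0 = np.full(K * K, 1.0 / (K * K))      # start from the product distribution
res = minimize(divergence_rate, p0, bounds=[(0.0, 1.0)] * (K * K),
               constraints=constraints, method="SLSQP")

P_star = res.x.reshape(K, K)
transition = P_star / P_star.sum(axis=1, keepdims=True)   # P(y | x) of the limiting chain
print("optimal pair distribution:\n", P_star)
print("transition matrix of the closest Markov chain:\n", transition)

Under the theorem, the conditional joint distribution of the first m samples converges to the m-step joint distribution of the chain started in x_{1} with this transition matrix; the sketch only computes that minimizing chain numerically and does not verify the limit itself.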
Keywords :
Information theory; Markov processes; Maximum-entropy methods; Convergence; Entropy; Probability distribution; State-space methods; Statistics; Sufficient conditions;
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.1987.1057385
Filename :
1057385