DocumentCode :
1007999
Title :
Relative entropy between Markov transition rate matrices
Author :
Kesidis, G.; Walrand, J.
Author_Institution :
Dept. of Electr. Eng. & Comput. Sci., Univ. of California, Berkeley, CA, USA
Volume :
39
Issue :
3
fYear :
1993
fDate :
5/1/1993
Firstpage :
1056
Lastpage :
1057
Abstract :
The relative entropy between two Markov transition rate matrices is derived from sample path considerations. This relative entropy is interpreted as a level-2.5 large-deviations action functional. That is, the level-two large-deviations action functional for empirical distributions of continuous-time Markov chains can be derived from this relative entropy using the contraction principle.
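For reference, a minimal LaTeX sketch of the quantities the abstract refers to, written in the standard notation of the large-deviations literature rather than the paper's own symbols (the generators Q and \tilde{Q}, the stationary distribution \tilde{\pi}, the empirical distribution \mu, and the empirical flows \phi below are assumed names, not taken from the paper, and the paper's conventions may differ). A commonly used form of the relative entropy rate of \tilde{Q} with respect to Q, with \tilde{\pi} the stationary distribution of \tilde{Q}, is
\[
  H(\tilde{Q}\,\|\,Q) \;=\; \sum_{x} \tilde{\pi}(x) \sum_{y \neq x}
  \Bigl[\, \tilde{q}(x,y)\,\log\frac{\tilde{q}(x,y)}{q(x,y)} \;-\; \tilde{q}(x,y) \;+\; q(x,y) \,\Bigr].
\]
Substituting \mu(x) = \tilde{\pi}(x) and \phi(x,y) = \tilde{\pi}(x)\,\tilde{q}(x,y) reads this as the level-2.5 action functional for the empirical distribution and empirical flows of a chain with generator Q,
\[
  I_{2.5}(\mu,\phi) \;=\; \sum_{x \neq y}
  \Bigl[\, \phi(x,y)\,\log\frac{\phi(x,y)}{\mu(x)\,q(x,y)} \;-\; \phi(x,y) \;+\; \mu(x)\,q(x,y) \,\Bigr],
\]
and the level-two action functional for the empirical distribution alone follows by the contraction principle, minimizing over flows that balance at every state:
\[
  I_{2}(\mu) \;=\; \inf_{\phi \,:\, \sum_{y}\bigl(\phi(x,y)-\phi(y,x)\bigr)=0 \;\forall x}\; I_{2.5}(\mu,\phi).
\]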
Keywords :
Markov processes; entropy; information theory; Markov transition rate matrices; continuous-time Markov chains; contraction mapping; level-2.5 large-deviations action functional; relative entropy; sample path considerations; Entropy; Information theory; Matrices; Poisson equations; Random number generation; Random variables; Trajectory; Transforms
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.256516
Filename :
256516