Title :
Kullback-Leibler divergence rate between probability distributions on sets of different cardinalities
Author_Institution :
Dept. of Bioeng., Univ. of Texas at Dallas, Richardson, TX, USA
Abstract :
In this paper we generalize the familiar notion of the Kullback-Leibler divergence between two probability distributions on a finite set to the case where the two probability distributions are defined on sets of different cardinalities. This is achieved by 'dilating' the distribution on the smaller set to one on the larger set. Then, using the log sum inequality, we give a formula for this divergence that permits all computations to be carried out on the set of smaller cardinality. However, computing the divergence is still in general an intractable problem. Hence we give a greedy algorithm for quickly obtaining an upper bound for this divergence. The ideas given here can be used to study the concept of an optimally aggregated model of a Markov or hidden Markov process. This is done in a companion paper.
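The construction described in the abstract can be sketched as follows. This is a hedged reconstruction, not the paper's actual algorithm: it assumes the dilation is induced by a surjective aggregation map `phi` from the larger index set onto the smaller one, and that the log sum inequality reduces the divergence to an ordinary KL divergence between the aggregated distribution and the smaller one. The brute-force search over maps below illustrates why the exact problem is combinatorially intractable and why a greedy upper bound is attractive; all names (`divergence_given_map`, `min_divergence`) are illustrative.

```python
import math
from itertools import product

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats; assumes supp(p) is
    contained in supp(q)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def divergence_given_map(p, q, phi):
    """Divergence between p (on the larger set, size n) and the best
    dilation of q (on the smaller set, size m) consistent with the
    aggregation map phi: {0..n-1} -> {0..m-1}.  By the log sum inequality
    the optimal dilation splits each q[b] proportionally to p within the
    block phi^{-1}(b), so the result equals D(Phi(p) || q) and is computed
    entirely on the smaller set."""
    p_agg = [0.0] * len(q)
    for a, pa in enumerate(p):
        p_agg[phi[a]] += pa
    return kl(p_agg, q)

def min_divergence(p, q):
    """Exact minimum over all surjective aggregation maps.  The search
    space has roughly m**n maps, so this is tractable only for tiny
    instances; a greedy heuristic would instead build phi block by block
    to obtain an upper bound."""
    n, m = len(p), len(q)
    best, best_phi = float("inf"), None
    for phi in product(range(m), repeat=n):
        if len(set(phi)) < m:  # require surjectivity
            continue
        d = divergence_given_map(p, q, phi)
        if d < best:
            best, best_phi = d, phi
    return best, best_phi
```

For example, with `p = [0.5, 0.3, 0.2]` and `q = [0.6, 0.4]`, the minimizing map groups the last two atoms together, giving an aggregated distribution `[0.5, 0.5]` and divergence `0.5 * log(25/24)`.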
Keywords :
Markov processes; entropy; set theory; statistical distributions; Kullback-Leibler divergence rate; Shannon entropy; cardinality; greedy algorithm; hidden Markov process; log sum inequality; probability distribution; complexity theory; mutual information
Conference_Titel :
49th IEEE Conference on Decision and Control (CDC), 2010
Conference_Location :
Atlanta, GA
Print_ISBN :
978-1-4244-7745-6
DOI :
10.1109/CDC.2010.5716982