Title :
Estimating Mutual Information Via Kolmogorov Distance
Author_Institution :
Carleton Univ., Ottawa
Abstract :
Using a coupling technique, two inequalities are established that give upper bounds on the mutual information of finite discrete random variables in terms of the Kolmogorov distance (variational distance).
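To make the two quantities the abstract relates concrete, the following is a minimal sketch (not the paper's bounds) that computes the mutual information of a finite discrete pair and the variational distance between its joint distribution and the product of its marginals, assuming standard definitions; note that some authors include a factor of 1/2 in the variational distance.

```python
import math
from itertools import product


def marginals(pxy):
    """Marginal pmfs of X and Y from a joint pmf given as {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return px, py


def mutual_information(pxy):
    """Mutual information I(X;Y) in nats for a finite joint pmf."""
    px, py = marginals(pxy)
    return sum(p * math.log(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)


def variational_distance(pxy):
    """Variational (Kolmogorov) distance sum |p(x,y) - p(x)p(y)|
    between the joint pmf and the product of its marginals."""
    px, py = marginals(pxy)
    return sum(abs(pxy.get((x, y), 0.0) - px[x] * py[y])
               for x, y in product(px, py))
```

For an independent pair both quantities are zero; for a perfectly correlated fair bit, `mutual_information` returns ln 2 and `variational_distance` returns 1.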
Keywords :
entropy; random processes; Kolmogorov distance; Shannon entropy; coupling technique; finite discrete random variables; variational distance; information theory; mathematics; mutual coupling; mutual information; probability distribution; random variables; statistical distributions; testing; upper bound;
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2007.903122