DocumentCode
2017667
Title
On Relationship between Mutual Information and Variation
Author
Prelov, V.
Author_Institution
Institute for Information Transmission Problems, Russian Academy of Sciences, Moscow
fYear
2007
fDate
24-29 June 2007
Firstpage
51
Lastpage
55
Abstract
The investigation of the relationship between mutual information and variational distance, begun in Pinsker's paper [1], where an upper bound on the mutual information in terms of the variational distance was obtained, is continued here. We present a simple lower bound that is optimal or asymptotically optimal in some cases. A uniform upper bound on the mutual information in terms of the variational distance is also derived for random variables with a finite number of values. For such random variables, the asymptotic behaviour of the maximum of the mutual information is also investigated as the variational distance tends to zero or to its maximum value.
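As a purely illustrative sketch (not taken from the paper), the short Python snippet below computes the two quantities the abstract relates for a finite-alphabet example: the mutual information I(X;Y) and the variational (L1) distance between the joint distribution and the product of its marginals. The example joint distribution and all names are hypothetical.

# Illustrative only: mutual information and variational distance for a
# hypothetical finite joint distribution (not the paper's bounds).
import numpy as np

def mutual_information(p_xy):
    # Mutual information (in nats) of a joint pmf given as a 2-D array.
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    prod = p_x @ p_y                      # product of the marginals
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / prod[mask])))

def variational_distance(p_xy):
    # L1 (variational) distance between P_XY and the product of its marginals.
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    return float(np.abs(p_xy - p_x @ p_y).sum())

# Hypothetical joint pmf on a 2x2 alphabet.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
print("I(X;Y) =", mutual_information(p_xy))
print("V(P_XY, P_X x P_Y) =", variational_distance(p_xy))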
Keywords
entropy; random processes; asymptotic behaviour; mutual information; random variables; variational distance; equations; upper bound;
fLanguage
English
Publisher
ieee
Conference_Title
2007 IEEE International Symposium on Information Theory (ISIT 2007)
Conference_Location
Nice
Print_ISBN
978-1-4244-1397-3
Type
conf
DOI
10.1109/ISIT.2007.4557078
Filename
4557078