DocumentCode :
1779527
Title :
Exact common information
Author :
Kumar, G. Ramesh ; Li, Cheuk Ting ; El Gamal, Abbas
Author_Institution :
Electr. Eng., Stanford Univ., Stanford, CA, USA
fYear :
2014
fDate :
June 29 - July 4, 2014
Firstpage :
161
Lastpage :
165
Abstract :
This paper introduces the notion of exact common information, which is the minimum description length of the common randomness needed for the exact distributed generation of two correlated random variables (X, Y). We introduce the quantity G(X; Y) = min_{X→W→Y} H(W) as a natural bound on the exact common information and study its properties and computation. We then introduce the exact common information rate, which is the minimum description rate of the common randomness for the exact generation of a 2-DMS (X, Y). We give a multiletter characterization for it as the limit G̅(X; Y) = lim_{n→∞} (1/n) G(X^n; Y^n). While in general G̅(X; Y) is greater than or equal to the Wyner common information, we show that they are equal for the symmetric binary erasure source. We do not know, however, whether the exact common information rate has a single-letter characterization in general.
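The two quantities named in the abstract can be restated as displayed equations (a minimal formal sketch; the symbol C_Wyner(X; Y) for the Wyner common information and the explicit minimization over conditional pmfs p(w|x, y) are notational choices added here for illustration and do not appear in the abstract):

G(X; Y) = \min_{p(w \mid x, y) \,:\, X \to W \to Y} H(W),

\bar{G}(X; Y) = \lim_{n \to \infty} \frac{1}{n} G(X^n; Y^n) \;\ge\; C_{\mathrm{Wyner}}(X; Y) = \min_{X \to W \to Y} I(X, Y; W),

where X → W → Y denotes that X and Y are conditionally independent given W, and (X^n, Y^n) is the n-fold i.i.d. extension of the 2-DMS (X, Y).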
Keywords :
approximation theory; correlation theory; entropy; Wyner common information; correlated random variables; exact common information rate; exact distributed generation; exact generation; minimum description length; minimum description rate; multiletter characterization; single letter characterization; symmetric binary erasure source; Decoding; Entropy; Information rates; Random variables
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2014 IEEE International Symposium on Information Theory (ISIT)
Conference_Location :
Honolulu, HI
Type :
conf
DOI :
10.1109/ISIT.2014.6874815
Filename :
6874815