Title :
Chain Independence and Common Information
Author :
Makarychev, Konstantin; Makarychev, Yury
Author_Institution :
Microsoft Research, Redmond, WA, USA
Abstract :
We present a new proof of a celebrated result of Gács and Körner that the common information is far less than the mutual information. Consider two sequences α1, …, αn and β1, …, βn of random variables, where the pairs (α1, β1), …, (αn, βn) are independent and identically distributed. Gács and Körner proved that it is not possible to extract “common information” from these two sequences unless the joint distribution matrix of the random variables (αi, βi) is a block matrix. In 2000, Romashchenko introduced the notion of chain-independent random variables and gave a simple proof of the result of Gács and Körner for chain-independent random variables. Furthermore, Romashchenko showed that Boolean random variables α and β are chain independent unless α = β a.s. or α = 1 - β a.s. In this paper, we generalize this result to arbitrary (finite) distributions of α and β and thus give a simple proof of the result of Gács and Körner.
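As a brief illustration of the block-matrix condition (this example is ours, not drawn from the paper, and the specific matrix is hypothetical), consider random variables α and β taking three values each, with joint distribution matrix

\[
P \;=\; \bigl(\Pr[\alpha = x,\ \beta = y]\bigr)_{x,y}
  \;=\;
\begin{pmatrix}
  1/4 & 1/4 & 0   \\
  0   & 0   & 1/4 \\
  0   & 0   & 1/4
\end{pmatrix}.
\]

This P is a block matrix: grouping row {1} with columns {1, 2} and rows {2, 3} with column {3} places all probability mass in the two diagonal blocks. The block indicator f(α) = [α = 1] coincides a.s. with g(β) = [β ≠ 3], so a nontrivial common random variable is computable from α alone and from β alone; that is, common information can be extracted. When no permutation of rows and columns yields such a decomposition, the Gács–Körner theorem says no such extraction is possible.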
Keywords :
Boolean functions; information theory; random processes; Boolean random variables; arbitrary distributions; chain independence; chain independent random variables; common information; finite distributions; joint distribution matrix; mutual information; Data mining; Joints; Linear matrix inequalities; Materials; Matrix decomposition; Random variables; Symmetric matrices; Chain independent random variables; common information; rate region
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2012.2196022