The problem of finding a meaningful measure of the "common information" or "common randomness" of two discrete dependent random variables $X, Y$ is studied. The quantity $C(X; Y)$ is defined as the minimum possible value of $I(X, Y; W)$, where the minimum is taken over all distributions defining an auxiliary random variable $W \in \mathcal{W}$, a finite set, such that $X, Y$ are conditionally independent given $W$. The main result of the paper is contained in two theorems which show that $C(X; Y)$ is i) the minimum $R_0$ such that a sequence of independent copies of $(X, Y)$ can be efficiently encoded into three binary streams $W_0, W_1, W_2$ with rates $R_0, R_1, R_2$, respectively, $\sum R_i = H(X, Y)$, and $X$ recovered from $(W_0, W_1)$, and $Y$ recovered from $(W_0, W_2)$, i.e., $W_0$ is the common stream; ii) the minimum binary rate $R_0$ of the common input to independent processors that generate an approximation to $X, Y$.
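The defining quantity $I(X, Y; W)$ can be checked numerically in a degenerate case. The sketch below (not from the paper; the helper names `entropy` and `marginal` are illustrative) takes perfectly correlated uniform bits $X = Y$ and the auxiliary choice $W = X$, under which $X$ and $Y$ are trivially conditionally independent given $W$, and evaluates $I(X, Y; W) = H(X, Y) + H(W) - H(X, Y, W)$:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def marginal(dist, keep):
    """Marginalize a dict {(x, y, w): p} onto the coordinate indices in `keep`."""
    out = {}
    for key, p in dist.items():
        k = tuple(key[i] for i in keep)
        out[k] = out.get(k, 0.0) + p
    return out

# Joint distribution p(x, y, w) with X = Y = W, each uniform on {0, 1}.
joint = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}

H_xyw = entropy(joint.values())                   # H(X, Y, W)
H_xy = entropy(marginal(joint, (0, 1)).values())  # H(X, Y)
H_w = entropy(marginal(joint, (2,)).values())     # H(W)

I_xy_w = H_xy + H_w - H_xyw
print(I_xy_w)  # 1.0 bit
```

Here the minimum is achieved by $W = X$, so $C(X; Y) = H(X) = 1$ bit; for genuinely dependent (not identical) variables the minimization over all admissible $W$ is nontrivial.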