Author Institution:
Sch. of Math. Sci., Queen Mary Univ. of London, London, UK
Abstract:
For discrete random variables X1, ..., Xn we construct an n × n matrix whose (i, j)-entry is the mutual information I(Xi; Xj) between Xi and Xj. In particular, the (i, i)-entry is the entropy H(Xi) = I(Xi; Xi) of Xi. This matrix, called the mutual information matrix of (X1, ..., Xn), has been conjectured to be positive semidefinite. In this paper, we give counterexamples to this conjecture, and show that it does hold for up to three random variables.
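The construction described above can be sketched in code. The following is a minimal illustration (not taken from the paper): it builds the mutual information matrix from a joint distribution using the identity I(Xi; Xj) = H(Xi) + H(Xj) - H(Xi, Xj), which on the diagonal reduces to I(Xi; Xi) = H(Xi), and then checks positive semidefiniteness via eigenvalues. The function names and the example distribution are illustrative choices, not from the source.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability array of any shape."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information_matrix(joint):
    """Mutual information matrix of n discrete random variables.

    `joint` is an n-dimensional array with joint[x1, ..., xn] = P(X1=x1, ..., Xn=xn).
    Entry (i, j) is I(Xi; Xj); the diagonal holds H(Xi) = I(Xi; Xi).
    """
    joint = np.asarray(joint, dtype=float)
    n = joint.ndim
    M = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # Marginal distributions of Xi, Xj, and the pair (Xi, Xj).
            p_i = joint.sum(axis=tuple(k for k in range(n) if k != i))
            p_j = joint.sum(axis=tuple(k for k in range(n) if k != j))
            p_ij = joint.sum(axis=tuple(k for k in range(n) if k not in (i, j)))
            # I(Xi; Xj) = H(Xi) + H(Xj) - H(Xi, Xj); for i == j this is H(Xi).
            M[i, j] = entropy(p_i) + entropy(p_j) - entropy(p_ij)
    return M

# Example (illustrative, not the paper's counterexample): two independent
# fair bits X, Y and their XOR Z. The variables are pairwise independent,
# so the mutual information matrix is the 3x3 identity, which is PSD.
joint = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        joint[x, y, x ^ y] = 0.25
M = mutual_information_matrix(joint)
is_psd = np.all(np.linalg.eigvalsh(M) >= -1e-9)
```

For three variables the matrix is always PSD, as the paper shows; the counterexamples it constructs involve more variables, and the same eigenvalue check would expose them.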
Keywords:
entropy; mutual information; mutual information matrix; discrete random variables; information inequalities; matrix algebra; linear algebra; linear matrix inequalities; eigenvalues and eigenfunctions