Author :
Kovacevic, Mladen ; Stanojevic, Ivan ; Senk, V.
Author_Institution :
Dept. of Electr. Eng., Univ. of Novi Sad, Novi Sad, Serbia
Abstract :
We investigate certain optimization problems for Shannon information measures, namely, minimization of the joint and conditional entropies H(X, Y), H(X|Y), H(Y|X), and maximization of the mutual information I(X; Y), over convex regions. When restricted to so-called transportation polytopes (sets of joint distributions with fixed marginals), these problems are all equivalent, which yields very simple NP-hardness proofs by revealing their connection to the well-known SUBSET SUM and PARTITION problems. The computational intractability of the more general problems over arbitrary polytopes then follows as a simple consequence. Further, a simple class of polytopes is exhibited over which the above problems are no longer equivalent and their complexities differ sharply: minimization of H(X, Y) and H(Y|X) is trivial, whereas minimization of H(X|Y) and maximization of I(X; Y) are strongly NP-hard. Finally, two new (pseudo)metrics on the space of discrete probability distributions, based on the so-called variation of information, are introduced, and the NP-hardness of their computation is shown.
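The equivalence claimed over transportation polytopes can be seen from the identities H(X|Y) = H(X, Y) - H(Y), H(Y|X) = H(X, Y) - H(X), and I(X; Y) = H(X) + H(Y) - H(X, Y): when both marginals are fixed, the four objectives differ only by constants and a sign. The following minimal Python sketch (an illustration only, not taken from the paper; the example joint distribution and variable names are assumptions) computes the four quantities for one point of such a polytope.

import numpy as np

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A joint distribution P(x, y) in the transportation polytope with marginals
# p = (0.5, 0.5) and q = (0.6, 0.4); any matrix with the same row and column
# sums would serve equally well here.
P = np.array([[0.4, 0.1],
              [0.2, 0.3]])

p = P.sum(axis=1)          # marginal of X (row sums)
q = P.sum(axis=0)          # marginal of Y (column sums)

H_XY = entropy(P.ravel())  # joint entropy H(X, Y)
H_X, H_Y = entropy(p), entropy(q)

H_X_given_Y = H_XY - H_Y   # H(X|Y) = H(X, Y) - H(Y)
H_Y_given_X = H_XY - H_X   # H(Y|X) = H(X, Y) - H(X)
I_XY = H_X + H_Y - H_XY    # I(X; Y) = H(X) + H(Y) - H(X, Y)

# With p and q fixed over the polytope, each quantity equals H(X, Y) plus or
# minus a constant, so minimizing any of the entropies (or maximizing I)
# amounts to the same optimization problem.
print(H_XY, H_X_given_Y, H_Y_given_X, I_XY)

This is why a single hardness reduction, such as the one from PARTITION mentioned in the abstract, suffices for all four problems in the transportation-polytope setting.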
Keywords :
computational complexity; information theory; Shannon information; optimization problems; computational intractability; convex regions; discrete probability distributions; random variables; entropy; entropy minimization; mutual information; maximization of mutual information; fixed marginals; transportation polytope; NP-hard; NP-complete; subset sum; partition; number partitioning; pseudometric; variation of information;