Title :
Informational divergence approximations to product distributions
Author :
Hou, Jie; Kramer, Gerhard
Author_Institution :
Inst. for Commun. Eng., Tech. Univ. München, Munich, Germany
Abstract :
The minimum rate needed to accurately approximate a product distribution under an unnormalized informational divergence is shown to be a mutual information. This result subsumes Wyner's result on common information and Han and Verdú's result on resolvability. The result also extends to cases where the source distribution is unknown but its entropy is known.
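As a point of reference for the quantities named in the abstract, the following is a minimal LaTeX sketch of an unnormalized informational divergence to a product distribution and the mutual-information characterization of the minimum rate; the notation (Q_Y, P_{Y^n}, X, Y, R) is assumed for illustration and is not taken from the record itself.

% Illustrative sketch only: notation is assumed, not drawn from the paper.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
The target product distribution on $n$-sequences and the unnormalized
informational (Kullback--Leibler) divergence used as the approximation
criterion are
\begin{align}
  Q_Y^n(y^n) &= \prod_{i=1}^{n} Q_Y(y_i), \\
  D\!\left(P_{Y^n} \,\middle\|\, Q_Y^n\right)
    &= \sum_{y^n} P_{Y^n}(y^n)\,
       \log \frac{P_{Y^n}(y^n)}{Q_Y^n(y^n)},
\end{align}
where $P_{Y^n}$ is the output distribution induced by a rate-$R$ code.
The abstract's claim is that the minimum rate $R$ for which this
(unnormalized) divergence can be made small is a mutual information
$I(X;Y)$ for an appropriate pair $(X,Y)$.
\end{document}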
Keywords :
entropy; Han-Verdú resolvability; informational divergence approximations; mutual information; product distributions; source distribution; Approximation methods; Channel coding; Conferences; Random variables
Conference_Titel :
2013 13th Canadian Workshop on Information Theory (CWIT)
Conference_Location :
Toronto, ON, Canada
DOI :
10.1109/CWIT.2013.6621596