Author_Institution :
Dept. of Electr. Eng., Technion - Israel Inst. of Technol., Haifa, Israel
Abstract :
We examine the classical joint source-channel coding problem from the viewpoint of statistical physics and demonstrate that, in the random coding regime, the posterior probability distribution of the source given the channel output is dominated by source sequences whose behavior closely parallels thermal equilibrium between two systems of particles that exchange energy, where one system corresponds to the source and the other to the channel. The thermodynamical entropies of the dual physical problem are analogous to the conditional and unconditional Shannon entropies of the source, and so their balance at thermal equilibrium yields a simple formula for the mutual information between the source and the channel output that is induced by the typical code in an ensemble of joint source-channel codes under certain conditions. This formula, together with the statistical-mechanical perspective that leads to it, forms the main contribution of this paper. We also demonstrate how our results can be used in applications such as the wiretap channel, and how they can be extended to multiuser scenarios such as the multiple-access channel.
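As a rough sketch of the analogy described above (using generic notation U for the source and Y for the channel output, which is not taken from the paper), the entropy balance rests on two standard identities: the thermal-equilibrium condition of equal temperatures between the two energy-exchanging subsystems, and the Shannon identity expressing mutual information as a balance of unconditional and conditional source entropies:
\[
\frac{\partial S_{\mathrm{source}}}{\partial E_{\mathrm{source}}} \;=\; \frac{\partial S_{\mathrm{channel}}}{\partial E_{\mathrm{channel}}}
\qquad \text{(equal inverse temperatures at thermal equilibrium)}
\]
\[
I(U;Y) \;=\; H(U) - H(U \mid Y)
\qquad \text{(mutual information as an entropy balance)}
\]
The paper's specific formula concerns the mutual information induced by the typical code in a random joint source-channel coding ensemble; its exact form depends on the ensemble and is not given in the abstract.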
Keywords :
joint source-channel coding; combined source-channel coding; entropy; Shannon entropy; random coding; random codes; mutual information; posterior probability distribution; statistical physics; thermal equilibrium; channel coding; information theory; probability distribution; magnetic fields; magnetic moments; physics; springs; temperature distribution;