DocumentCode
755641
Title
On the Minimum Entropy of a Mixture of Unimodal and Symmetric Distributions
Author
Chen, Ting-Li; Geman, Stuart
Author_Institution
Inst. of Stat. Sci., Acad. Sinica, Taipei
Volume
54
Issue
7
fYear
2008
fDate
7/1/2008 12:00:00 AM
Firstpage
3166
Lastpage
3174
Abstract
Progressive encoding of a signal generally involves an estimation step, designed to reduce the entropy of the residual of an observation below the entropy of the observation itself. Often the conditional distributions of an observation, given already-encoded observations, are well fit within a class of symmetric and unimodal distributions (e.g., the two-sided geometric distributions in images of natural scenes, or symmetric Paretian distributions in models of financial data). It is common practice to choose an estimator that centers, or aligns, the modes of the conditional distributions, since intuition suggests that this will minimize the entropy, and hence the coding cost, of the residuals. But, except in a special case, no rigorous proof has been given. Here we prove that the entropy of an arbitrary mixture of symmetric and unimodal distributions is minimized by aligning the modes. The result generalizes to unimodal and rotation-invariant distributions in R^n. We illustrate the result through some experiments with natural images.
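The claim in the abstract can be checked numerically. The sketch below, which is an illustration rather than the paper's own experiment, mixes two two-sided geometric distributions (the family the abstract mentions) and compares the Shannon entropy of the mixture when the modes are aligned versus shifted apart; all parameter values are arbitrary choices for demonstration:

```python
import numpy as np

def two_sided_geometric(theta, mode, support):
    """Symmetric, unimodal pmf p(k) proportional to theta**|k - mode|,
    normalized over the given integer support."""
    p = theta ** np.abs(support - mode)
    return p / p.sum()

def entropy_bits(p):
    """Shannon entropy in bits, skipping zero-probability terms."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

support = np.arange(-50, 51)

# Two symmetric unimodal components with different spreads.
a = two_sided_geometric(0.5, mode=0, support=support)
b_aligned = two_sided_geometric(0.8, mode=0, support=support)
b_shifted = two_sided_geometric(0.8, mode=5, support=support)

# Equal-weight mixtures: modes aligned vs. modes 5 apart.
aligned = entropy_bits(0.5 * a + 0.5 * b_aligned)
misaligned = entropy_bits(0.5 * a + 0.5 * b_shifted)

print(f"aligned entropy:    {aligned:.4f} bits")
print(f"misaligned entropy: {misaligned:.4f} bits")
```

Consistent with the theorem, the aligned mixture should report the smaller entropy; the same comparison can be repeated with any symmetric unimodal components and any mixture weights.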
Keywords
data compression; entropy codes; entropy coding; image coding; lossless image compression; natural scenes; natural image; predictive coding; rotation-invariant distribution; signal encoding; symmetric distributions; unimodal distributions; mixture distributions; LOCO; Costs; Entropy; Fluctuations; Layout; Random variables; Security; Signal design; Solid modeling
fLanguage
English
Journal_Title
IEEE Transactions on Information Theory
Publisher
IEEE
ISSN
0018-9448
Type
jour
DOI
10.1109/TIT.2008.924686
Filename
4544954