DocumentCode :
788357
Title :
Arbitrary source models and Bayesian codebooks in rate-distortion theory
Author :
Kontoyiannis, Ioannis ; Zhang, Junshan
Author_Institution :
Dept. of Comput. Sci., Brown Univ., Providence, RI, USA
Volume :
48
Issue :
8
fYear :
2002
fDate :
8/1/2002
Firstpage :
2276
Lastpage :
2290
Abstract :
We characterize the best achievable performance of lossy compression algorithms operating on arbitrary random sources, and with respect to general distortion measures. Direct and converse coding theorems are given for variable-rate codes operating at a fixed distortion level, emphasizing: (a) nonasymptotic results, (b) optimal or near-optimal redundancy bounds, and (c) results with probability one. This development is based in part on the observation that there is a precise correspondence between compression algorithms and probability measures on the reproduction alphabet. This is analogous to the Kraft inequality in lossless data compression. In the case of stationary ergodic sources our results reduce to the classical coding theorems. As an application of these general results, we examine the performance of codes based on mixture codebooks for discrete memoryless sources. A mixture codebook (or Bayesian codebook) is a random codebook generated from a mixture over some class of reproduction distributions. We demonstrate the existence of universal mixture codebooks, and show that it is possible to universally encode memoryless sources with redundancy of approximately (d/2) log n bits, where d is the dimension of the simplex of probability distributions on the reproduction alphabet.
Keywords :
Bayes methods; memoryless systems; probability; random codes; rate-distortion theory; source coding; variable-length codes; variable-rate codes; Bayesian codebooks; Kraft inequality; compression algorithms; converse coding theorems; direct coding theorems; discrete memoryless sources; distortion measures; lossless data compression; lossy compression algorithms; mixture codebooks; near-optimal redundancy bounds; nonasymptotic results; optimal redundancy bounds; probability distributions; probability measures; random codebooks; random sources; reproduction alphabet; reproduction distributions; source models; stationary ergodic sources; universal mixture codebooks; Bayesian methods; codes; data compression; decoding; distortion measurement; loss measurement; performance loss
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2002.800493
Filename :
1019839