  • DocumentCode
    1559331
  • Title
    The index entropy of a mismatched codebook
  • Author
    Zamir, Ram
  • Author_Institution
    Dept. of Electr. Eng.-Syst., Tel Aviv Univ., Israel
  • Volume
    48
  • Issue
    2
  • fYear
    2002
  • fDate
    2/1/2002
  • Firstpage
    523
  • Lastpage
    528
  • Abstract
    Entropy coding is a well-known technique to reduce the rate of a quantizer. It plays a particularly important role in universal quantization, where the quantizer codebook is not matched to the source statistics. We investigate the gain due to entropy coding by considering the entropy of the index of the first codeword, in a mismatched random codebook, that D-matches the source word. We show that the index entropy is strictly lower than the "uncoded" rate of the code, provided that the entropy is conditioned on the codebook. The number of bits saved by conditional entropy coding is equal to the divergence between the "favorite type" (the limiting empirical distribution of the first D-matching codeword) and the codebook-generating distribution. Specific examples are provided.
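    The "favorite type" effect described in the abstract can be illustrated with a small Monte Carlo sketch. The code below is not from the paper; all names and parameters (a Bernoulli(p) source, an i.i.d. Bernoulli(q) codebook, normalized Hamming distortion D, blocklength n) are illustrative assumptions. It scans a fresh random codebook for the first codeword within distortion D of the source word, records that codeword's empirical type, and reports the divergence between the empirical favorite type and the codebook-generating distribution, the quantity the abstract identifies as the per-symbol savings of conditional entropy coding.

    ```python
    import math
    import random

    def hamming(a, b):
        """Number of positions where the two words differ."""
        return sum(x != y for x, y in zip(a, b))

    def bernoulli_word(p, n, rng):
        """Length-n i.i.d. Bernoulli(p) binary word."""
        return [1 if rng.random() < p else 0 for _ in range(n)]

    def first_match(source, q, D, rng, max_idx=100_000):
        """Draw i.i.d. Bernoulli(q) codewords until one is within
        normalized Hamming distortion D of the source word.
        Returns (index, codeword), or (None, None) if none found."""
        n = len(source)
        for i in range(max_idx):
            cw = bernoulli_word(q, n, rng)
            if hamming(source, cw) <= D * n:
                return i, cw
        return None, None

    def kl_bernoulli(a, b):
        """Divergence D(Bern(a) || Bern(b)) in bits per symbol."""
        def term(x, y):
            return 0.0 if x == 0 else x * math.log2(x / y)
        return term(a, b) + term(1 - a, 1 - b)

    def favorite_type(p=0.5, q=0.3, D=0.3, n=20, trials=300, seed=7):
        """Empirical type of the first D-matching codeword, averaged
        over trials, plus its divergence from the codebook law q."""
        rng = random.Random(seed)
        ones = total = 0
        for _ in range(trials):
            src = bernoulli_word(p, n, rng)
            _, cw = first_match(src, q, D, rng)
            if cw is not None:
                ones += sum(cw)
                total += n
        frac = ones / total
        return frac, kl_bernoulli(frac, q)

    if __name__ == "__main__":
        frac, kl = favorite_type()
        print(f"favorite type ~ {frac:.3f} (codebook q = 0.3)")
        print(f"divergence ~ {kl:.4f} bits/symbol")
    ```

    With these mismatched parameters the empirical favorite type lands strictly above q = 0.3, pulled toward the Bernoulli(0.5) source, so the divergence is strictly positive: conditioned on the codebook, the index carries that many fewer bits per symbol than the uncoded rate, in line with the abstract's claim.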
  • Keywords
    entropy codes; random codes; source coding; codebook-generating distribution; conditional entropy coding; entropy coding; favorite type distribution; gain; index entropy; mismatched codebook; mismatched random codebook; quantizer codebook; source statistics; source word; universal quantization; Distortion measurement; Entropy coding; Information theory; Lattices; Quantization; Source coding; Statistics
  • fLanguage
    English
  • Journal_Title
    Information Theory, IEEE Transactions on
  • Publisher
    ieee
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/18.979328
  • Filename
    979328