• DocumentCode
    916170
  • Title

    Quantizing for maximum output entropy (Corresp.)

  • Author

    Messerschmitt, D.

  • Volume
    17
  • Issue
    5
  • fYear
    1971
  • fDate
    9/1/1971
  • Firstpage
    612
  • Lastpage
    612
  • Abstract
    The entropy at the output of a quantizer is equal to the average mutual information between the unquantized and quantized random variables. Thus, for a fixed number of quantization levels, output entropy is a reasonable information-theoretic criterion of quantizer fidelity. It is shown that, for a class of signal distributions that includes the Gaussian, the maximum-output-entropy (MOE) and minimum-average-error (MAE) quantizers are approximately the same within a multiplicative constant.
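    The MOE quantizer described in the abstract makes every output level equally probable, so its output entropy attains the maximum log2(N) bits for N levels. As a minimal illustration (not the paper's own construction), the decision thresholds for a Gaussian source can be placed at equal-probability points of the distribution via the inverse CDF; the function name below is an assumption for this sketch.

    ```python
    from statistics import NormalDist

    def moe_thresholds(n_levels, mu=0.0, sigma=1.0):
        """Decision thresholds of a maximum-output-entropy (MOE)
        quantizer for a Gaussian source: each of the n_levels cells
        carries probability 1/n_levels, so the output entropy is
        exactly log2(n_levels) bits."""
        d = NormalDist(mu, sigma)
        # Interior thresholds sit at the k/n_levels quantiles.
        return [d.inv_cdf(k / n_levels) for k in range(1, n_levels)]

    # For 4 levels on a unit Gaussian the thresholds are the quartiles,
    # symmetric about the mean; each cell then has probability 1/4.
    thresholds = moe_thresholds(4)
    ```

    By contrast, the MAE (e.g. Lloyd-Max) quantizer minimizes mean error rather than equalizing cell probabilities; the paper's result is that for Gaussian-like densities the two threshold sets agree up to an approximate multiplicative constant.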
  • Keywords
    Entropy; Quantization (signal); Mutual information; Random variables; Bit rate; Laplace equations; Phase change materials; Statistics; Telephony
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type

    jour

  • DOI
    10.1109/TIT.1971.1054681
  • Filename
    1054681