  • DocumentCode
    1472166
  • Title
    Redundancy-Related Bounds for Generalized Huffman Codes
  • Author
    Baer, Michael B.
  • Author_Institution
    Vista Res., Monterey, CA, USA
  • Volume
    57
  • Issue
    4
  • fYear
    2011
  • fDate
    4/1/2011
  • Firstpage
    2278
  • Lastpage
    2290
  • Abstract
    This paper presents new lower and upper bounds for the compression rate of binary prefix codes optimized over memoryless sources according to various nonlinear codeword length objectives. Like the best-known redundancy bounds for minimum average redundancy coding (Huffman coding), these are in terms of a form of entropy and/or the probability of an input symbol, often the most probable one. The bounds here, some of which are tight, improve on known bounds of the form L ∈ [H, H+1), where H is some form of entropy in bits (or, in the case of redundancy objectives, 0) and L is the length objective, also in bits. The objectives explored here include exponential-average length, maximum pointwise redundancy, and exponential-average pointwise redundancy (also called dth exponential redundancy). The first of these relates to various problems involving queueing, uncertainty, and lossless communications; the second relates to problems involving Shannon coding and universal modeling. Also explored here for these two objectives is the related matter of individual codeword length.
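    The classical bound the abstract refers to, H ≤ L < H + 1 for the average length L of a Huffman code over a source with Shannon entropy H, can be checked numerically. A minimal sketch (the example distribution and function name are illustrative, not taken from the paper):

    ```python
    import heapq
    import itertools
    import math

    def huffman_lengths(probs):
        """Return optimal binary prefix codeword lengths via Huffman's algorithm."""
        counter = itertools.count()  # tiebreaker so the heap never compares lists
        heap = [(p, next(counter), [i]) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        lengths = [0] * len(probs)
        while len(heap) > 1:
            p1, _, s1 = heapq.heappop(heap)
            p2, _, s2 = heapq.heappop(heap)
            for i in s1 + s2:  # each merge adds one bit to every symbol in the subtree
                lengths[i] += 1
            heapq.heappush(heap, (p1 + p2, next(counter), s1 + s2))
        return lengths

    probs = [0.4, 0.2, 0.2, 0.1, 0.1]  # illustrative memoryless source
    L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
    H = -sum(p * math.log2(p) for p in probs)
    print(round(H, 3), round(L, 3))  # H ≈ 2.122, L = 2.2, so H <= L < H + 1
    ```

    The paper's contribution is tightening and generalizing this interval for nonlinear objectives (exponential-average length, pointwise redundancy), where the role of H is played by other entropy forms such as Rényi entropy.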
  • Keywords
    Huffman codes; binary codes; entropy codes; information theory; memoryless systems; nonlinear codes; probability; queueing theory; redundancy; Huffman coding; Shannon coding; binary prefix codes; compression rate; entropy; exponential redundancy; exponential-average length; exponential-average pointwise redundancy; generalized Huffman codes; individual codeword length; lossless communications; maximum pointwise redundancy; memoryless sources; minimum average redundancy coding; nonlinear codeword length objectives; queueing; redundancy bounds; redundancy objectives; redundancy-related bounds; universal modeling; Channel coding; Entropy; Minimization; Probability distribution; Redundancy; Upper bound; Rényi entropy; Shannon codes; optimal prefix code; worst case minimax redundancy
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2011.2110670
  • Filename
    5730556