DocumentCode
1246483
Title
Probability estimation in arithmetic and adaptive-Huffman entropy coders
Author
Duttweiler, Donald L.; Chamzas, Christodoulos
Author_Institution
AT&T Bell Labs., Holmdel, NJ, USA
Volume
4
Issue
3
fYear
1995
fDate
3/1/1995
Firstpage
237
Lastpage
246
Abstract
Entropy coders, such as Huffman and arithmetic coders, achieve compression by exploiting nonuniformity in the probabilities under which a random variable to be coded takes on its possible values. Practical realizations generally require running adaptive estimates of these probabilities. An analysis of the relationship between estimation quality and the resulting coding efficiency suggests a particular scheme, dubbed scaled-count, for obtaining such estimates. It can optimally balance estimation accuracy against the need for rapid response to changing underlying statistics. When the symbols being coded are from a binary alphabet, simple hardware and software implementations requiring almost no computation are possible. A scaled-count adaptive probability estimator of the type described in this paper is used in the arithmetic coder of the JBIG and JPEG image coding standards.
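The abstract only sketches the idea at a high level. As a rough illustrative sketch (not the paper's actual estimator or the JBIG/JPEG state machine; the structure names and the threshold MAX_TOTAL below are assumptions), a scaled-count binary probability estimator can be viewed as keeping per-symbol counts and halving them whenever their sum reaches a cap, so that recent symbols dominate the estimate:

```c
#include <stdio.h>

/* Hypothetical scaled-count estimator for a binary alphabet.
 * MAX_TOTAL is an assumed scaling threshold; it trades estimation
 * accuracy against speed of adaptation to changing statistics. */
#define MAX_TOTAL 64

typedef struct {
    unsigned c0;  /* count of observed 0 symbols (initialized to 1) */
    unsigned c1;  /* count of observed 1 symbols (initialized to 1) */
} ScaledCount;

static void sc_init(ScaledCount *sc) {
    sc->c0 = 1;
    sc->c1 = 1;
}

/* Current estimate of P(symbol == 1). */
static double sc_p1(const ScaledCount *sc) {
    return (double)sc->c1 / (double)(sc->c0 + sc->c1);
}

/* Update counts with a new symbol; rescale (halve) when the total
 * reaches the threshold so the estimate can track changing statistics. */
static void sc_update(ScaledCount *sc, int bit) {
    if (bit) sc->c1++; else sc->c0++;
    if (sc->c0 + sc->c1 >= MAX_TOTAL) {
        sc->c0 = (sc->c0 + 1) / 2;  /* keep counts at least 1 */
        sc->c1 = (sc->c1 + 1) / 2;
    }
}

int main(void) {
    ScaledCount sc;
    sc_init(&sc);
    /* Feed a skewed bit stream and watch the estimate adapt. */
    int bits[] = {1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1};
    for (size_t i = 0; i < sizeof bits / sizeof bits[0]; i++) {
        sc_update(&sc, bits[i]);
        printf("after bit %d: P(1) ~ %.3f\n", bits[i], sc_p1(&sc));
    }
    return 0;
}
```

The halving step is what distinguishes a scaled-count estimator from a plain frequency counter: capping the total count keeps the probability estimate responsive to nonstationary sources, at the cost of some accuracy on stationary ones.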
Keywords
Huffman codes; adaptive codes; adaptive estimation; arithmetic codes; code standards; entropy codes; image coding; probability; JBIG image coding standard; JPEG image coding standard; adaptive-Huffman entropy coders; arithmetic coders; binary alphabet; coding efficiency; compression; scaled-count; estimation quality; hardware; probability estimation; random variable; scaled-count adaptive probability estimator; software; statistics; symbol coding; Arithmetic; Decoding; Entropy coding; Hardware; Image coding; Probability; Random variables; Standards development; Statistics; Transform coding
fLanguage
English
Journal_Title
IEEE Transactions on Image Processing
Publisher
IEEE
ISSN
1057-7149
Type
jour
DOI
10.1109/83.366473
Filename
366473