DocumentCode :
925149
Title :
Huffman codes and self-information
Author :
Katona, Gyula O. H.; Nemetz, Tibor O. H.
Volume :
22
Issue :
3
fYear :
1976
fDate :
5/1/1976
Firstpage :
337
Lastpage :
340
Abstract :
In this paper the connection between the self-information of a source letter from a finite alphabet and its codeword length in a Huffman code is investigated. Consider the set of all independent finite-alphabet sources that contain a source letter a of probability p. The maximum over this set of the length of a Huffman codeword for a is determined. This maximum remains constant as p varies between the reciprocals of two consecutive Fibonacci numbers. For small p, this maximum is approximately equal to \left[ \log_{2} \frac{1+\sqrt{5}}{2} \right]^{-1} \approx 1.44 times the self-information.
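The Fibonacci connection in the abstract can be illustrated numerically. The sketch below (not from the paper; the Fibonacci-weighted source and the helper `huffman_lengths` are illustrative choices) builds a Huffman code for a source whose letter probabilities are proportional to Fibonacci numbers, a classic construction that yields a maximally skewed code tree. The rarest letter's codeword length then noticeably exceeds its self-information -log2(p), trending toward the 1.44 factor as the alphabet grows:

```python
import heapq
import itertools
import math

def huffman_lengths(probs):
    """Return Huffman codeword lengths for a list of probabilities."""
    counter = itertools.count()  # tie-breaker so tuples never compare sets
    heap = [(p, next(counter), {i}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)  # two least probable subtrees
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 | s2:                # every leaf below the merge
            lengths[i] += 1              # moves one level deeper
        heapq.heappush(heap, (p1 + p2, next(counter), s1 | s2))
    return lengths

# Source with probabilities proportional to 1, 1, 2, 3, 5, ..., 144:
# Huffman merging then forms a chain, the deepest possible tree.
fib = [1, 1]
for _ in range(10):
    fib.append(fib[-1] + fib[-2])
total = sum(fib)
probs = [f / total for f in fib]

lengths = huffman_lengths(probs)
p_min = probs[0]                         # rarest letter, p = 1/376
ratio = max(lengths) / (-math.log2(p_min))
print(max(lengths), ratio)               # codeword length vs. self-information
```

With 12 letters the ratio is about 1.29; as the Fibonacci alphabet grows and p shrinks, it approaches the limiting constant [log2((1+sqrt(5))/2)]^{-1} ≈ 1.44 stated in the abstract.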
Keywords :
Huffman codes; Codes; Convergence; Data compression; Distortion measurement; Entropy; Information theory; Mathematics; Probability; Rate-distortion; Topology;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.1976.1055554
Filename :
1055554