In this paper the connection between the self-information of a source letter from a finite alphabet and its code-word length in a Huffman code is investigated. Consider the set of all independent finite-alphabet sources which contain a source letter $a$ of probability $p$. The maximum over this set of the length of a Huffman codeword for $a$ is determined. This maximum remains constant as $p$ varies between the reciprocals of two consecutive Fibonacci numbers. For small $p$ this maximum is approximately equal to $\left[ \log_{2} \frac{1+\sqrt{5}}{2} \right]^{-1} \approx 1.44$ times the self-information.
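
As an illustration (not part of the original abstract, and not claimed to be the paper's exact extremal construction), the sketch below builds a source whose letter probabilities are proportional to the Fibonacci numbers $F_1, F_2, \ldots, F_n$. For such a source, Huffman merging proceeds as a single chain, so the rarest letter, of probability $p = 1/(F_{n+2}-1)$, receives a codeword of length $n-1$; as $n$ grows, this length approaches $\left[\log_2 \frac{1+\sqrt{5}}{2}\right]^{-1} \approx 1.44$ times its self-information $-\log_2 p$. The helper name `huffman_lengths` and the choice $n = 20$ are illustrative assumptions.

```python
# Minimal sketch: a Fibonacci-weighted source in which the rarest letter
# gets a Huffman codeword of depth n - 1, roughly 1.44 times its
# self-information for large n. Illustrative only.
import heapq
import math
from itertools import count

def huffman_lengths(weights):
    """Return the Huffman codeword length of each letter, given positive weights."""
    tie = count()  # tie-breaker so the heap never compares the index sets
    # Heap entries: (subtree weight, tie-break id, indices of leaves in the subtree)
    heap = [(w, next(tie), {i}) for i, w in enumerate(weights)]
    heapq.heapify(heap)
    depth = [0] * len(weights)
    while len(heap) > 1:
        w1, _, s1 = heapq.heappop(heap)
        w2, _, s2 = heapq.heappop(heap)
        merged = s1 | s2
        for i in merged:           # every leaf under a merge gains one level
            depth[i] += 1
        heapq.heappush(heap, (w1 + w2, next(tie), merged))
    return depth

n = 20                                         # number of source letters (assumed for the example)
fib = [1, 1]
while len(fib) < n:
    fib.append(fib[-1] + fib[-2])

total = sum(fib)                               # equals F_{n+2} - 1
p = fib[0] / total                             # probability of the rarest letter
length = huffman_lengths(fib)[0]               # its Huffman codeword length (n - 1)
self_info = -math.log2(p)
factor = 1 / math.log2((1 + math.sqrt(5)) / 2) # ~1.44

print(f"p = {p:.3g}")
print(f"Huffman codeword length   = {length}")
print(f"1.44 * self-information  ~= {factor * self_info:.2f}")
```

For $n = 20$ this prints a codeword length of 19 against roughly 20.3 for $1.44$ times the self-information; the ratio tends to the stated constant only slowly as $n$ increases, and the true maximum over all sources containing a letter of probability $p$ may exceed the length produced by this particular example.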