It is known that the expected codeword length $L_{\mathrm{UD}}$ of the best uniquely decodable (UD) code satisfies $H(X) \le L_{\mathrm{UD}} < H(X) + 1$.
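Here $L_{\mathrm{UD}}$ is the minimum expected codeword length over uniquely decodable codes and $H(X)$ is the entropy of $X$ in bits. One standard way to see this bound, assuming symbol probabilities $p_1, p_2, \ldots$, is via the Kraft–McMillan inequality and the Shannon code with lengths $\ell_i = \lceil \log_2 (1/p_i) \rceil$:
$$
\sum_i 2^{-\ell_i} \;\le\; \sum_i 2^{-\log_2 (1/p_i)} \;=\; \sum_i p_i \;=\; 1,
\qquad
L_{\mathrm{UD}} \;\le\; \sum_i p_i\,\ell_i \;<\; \sum_i p_i\Bigl(\log_2 \tfrac{1}{p_i} + 1\Bigr) \;=\; H(X) + 1,
$$
while $L_{\mathrm{UD}} \ge H(X)$ follows because the Kraft–McMillan inequality holds for every uniquely decodable code.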

Let $X$ be a random variable which can take on $n$ values. Then it is shown that the average codeword length $L_{1:1}$ for the best one-to-one (not necessarily uniquely decodable) code for $X$ is shorter than the average codeword length $L_{\mathrm{UD}}$ for the best uniquely decodable code by no more than $\log \log n + 3$.
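As a small numerical sketch (not taken from the paper), the following Python snippet compares these quantities for an arbitrary example distribution: the Huffman code serves as the optimal UD code, and the best one-to-one code gives the $i$-th most probable symbol the $i$-th shortest binary string (the empty string allowed), of length $\lfloor \log_2 i \rfloor$. The probabilities and helper names are illustrative only.

import heapq
import itertools
from math import log2, floor

def entropy(p):
    # Shannon entropy in bits
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def huffman_expected_length(p):
    # Expected codeword length of a Huffman code (an optimal prefix/UD code).
    # Each merge adds one bit to every leaf beneath it, so the expected length
    # equals the sum of the merged probabilities over all merges.
    counter = itertools.count()              # tiebreaker for equal probabilities
    heap = [(pi, next(counter)) for pi in p]
    heapq.heapify(heap)
    expected = 0.0
    while len(heap) > 1:
        p1, _ = heapq.heappop(heap)
        p2, _ = heapq.heappop(heap)
        expected += p1 + p2
        heapq.heappush(heap, (p1 + p2, next(counter)))
    return expected

def one_to_one_expected_length(p):
    # Best one-to-one (nonsingular) code: sort probabilities in decreasing order;
    # the i-th symbol gets the i-th shortest binary string, of length floor(log2 i)
    # when the empty string is allowed as a codeword (i starts at 1).
    return sum(pi * floor(log2(i))
               for i, pi in enumerate(sorted(p, reverse=True), start=1))

probs = [0.30, 0.20, 0.15, 0.12, 0.10, 0.08, 0.05]   # arbitrary example distribution
n = len(probs)
print("H(X)             =", entropy(probs))
print("L_UD  (Huffman)  =", huffman_expected_length(probs))
print("L_1:1 (best 1-1) =", one_to_one_expected_length(probs))
print("log log n + 3    =", log2(log2(n)) + 3)       # bound on the gap L_UD - L_1:1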

Let $Y$ be a random variable taking on a finite or countable number of values and having entropy $H$. Then it is proved that $L_{1:1} \ge H - \log(H+1) - \log\log(H+1) - \cdots - 6$. Some relations are established among the Kolmogorov, Chaitin, and extension complexities. Finally it is shown that, for all computable probability distributions, the universal prefix codes associated with the conditional Chaitin complexity have expected codeword length within a constant of the Shannon entropy.