DocumentCode :
928598
Title :
Some equivalences between Shannon entropy and Kolmogorov complexity
Author :
Leung-Yan-Cheong, Sik K. ; Cover, Thomas M.
Volume :
24
Issue :
3
fYear :
1978
fDate :
May 1, 1978
Firstpage :
331
Lastpage :
338
Abstract :
It is known that the expected codeword length $L_{UD}$ of the best uniquely decodable (UD) code satisfies $H(X) \leq L_{UD} < H(X) + 1$. Let $X$ be a random variable that can take on $n$ values. It is shown that the average codeword length $L_{1:1}$ of the best one-to-one (not necessarily uniquely decodable) code for $X$ is shorter than the average codeword length $L_{UD}$ of the best uniquely decodable code by no more than $(\log_2 \log_2 n) + 3$. Let $Y$ be a random variable taking on a finite or countable number of values and having entropy $H$. It is proved that $L_{1:1} \geq H - \log_2(H+1) - \log_2 \log_2 (H+1) - \cdots - 6$. Some relations are established among the Kolmogorov, Chaitin, and extension complexities. Finally, it is shown that, for all computable probability distributions, the universal prefix codes associated with the conditional Chaitin complexity have expected codeword length within a constant of the Shannon entropy.
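As a rough numerical illustration of the first two bounds, the following Python sketch computes $H(X)$, the best one-to-one codeword length $L_{1:1}$ (assigning the $i$-th most probable value the $i$-th binary string, of length $\lfloor \log_2 i \rfloor$), and the expected length of a Huffman code, which attains the optimal $L_{UD}$. The distribution `p` is a made-up example, not taken from the paper.

```python
import heapq
import math

def entropy(p):
    """Shannon entropy H(X) in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def best_one_to_one_length(p):
    """Expected length of the best one-to-one (not necessarily uniquely
    decodable) binary code: sort probabilities in decreasing order and
    assign the i-th symbol the i-th binary string (empty string, 0, 1,
    00, 01, ...), whose length is floor(log2(i)) for i = 1, 2, ...."""
    p = sorted(p, reverse=True)
    return sum(pi * math.floor(math.log2(i)) for i, pi in enumerate(p, 1))

def huffman_length(p):
    """Expected codeword length of a Huffman code, which achieves the
    optimal L_UD over all uniquely decodable codes. The expected length
    equals the sum of the merged probabilities over all merge steps."""
    if len(p) == 1:
        return 0.0
    # Each heap entry is (subtree probability, expected depth so far).
    heap = [(pi, 0.0) for pi in p]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, l1 = heapq.heappop(heap)
        p2, l2 = heapq.heappop(heap)
        # Merging adds one bit to every codeword under both subtrees.
        heapq.heappush(heap, (p1 + p2, l1 + l2 + p1 + p2))
    return heap[0][1]

# Hypothetical example distribution on n = 8 values.
p = [0.35, 0.20, 0.15, 0.10, 0.08, 0.06, 0.04, 0.02]
n = len(p)
H = entropy(p)
L11 = best_one_to_one_length(p)
Lud = huffman_length(p)

print(f"H(X)  = {H:.4f} bits")
print(f"L_1:1 = {L11:.4f} bits")
print(f"L_UD  = {Lud:.4f} bits")
# The paper's bounds: H <= L_UD < H + 1, and
# L_UD - L_1:1 <= log2(log2(n)) + 3.
assert H <= Lud < H + 1
assert Lud - L11 <= math.log2(math.log2(n)) + 3
```

Note that $L_{1:1}$ can fall well below $H(X)$ here (dropping unique decodability buys shorter codewords), which is exactly why the paper's lower bound on $L_{1:1}$ subtracts iterated logarithms of $H$ rather than matching $H$ itself.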
Keywords :
Coding; Entropy functions; Concatenated codes; Decoding; Distributed computing; Encoding; Entropy; Information theory; Probability distribution; Random variables; Statistics;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.1978.1055891
Filename :
1055891