DocumentCode :
2058318
Title :
Communication and distributional complexity of joint probability mass functions
Author :
Jaggi, Sidharth ; Effros, Michelle
Author_Institution :
Dept. of Electr. Eng., Caltech, Pasadena, CA
fYear :
2004
fDate :
2004
Firstpage :
362
Lastpage :
362
Abstract :
The problem of truly lossless (Pe = 0) distributed source coding requires knowledge of the joint statistics of the sources. In particular, the locations of the zeros of the probability mass functions (pmfs) are crucial for encoding at rates below (H(X), H(Y)). We consider the distributed computation of the empirical joint pmf Pn of a sequence of random-variable pairs observed at physically separated nodes of a network. We consider both worst-case and average measures of information exchange and treat both exact calculation of Pn and a notion of approximation. We find that in all cases the communication cost grows linearly with the size of the input. Further, we consider the problem of determining whether the empirical pmf has a zero in a particular location and show that, in most of the cases considered, this too requires communication that is linear in the input size.
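To make the central object concrete, here is a minimal sketch of the empirical joint pmf Pn and a zero-location check. This is illustrative only: in the paper the two coordinate sequences are observed at physically separated nodes and the cost of interest is the communication needed to compute Pn, whereas here both sequences are local; the function names are invented for this example.

```python
from collections import Counter

def empirical_joint_pmf(xs, ys):
    """Empirical joint pmf P_n of a sequence of (x, y) pairs.

    Illustrative sketch: both sequences are local here; in the
    distributed setting each node sees only one coordinate.
    """
    n = len(xs)
    counts = Counter(zip(xs, ys))
    return {pair: c / n for pair, c in counts.items()}

def has_zero_at(pmf, pair):
    """True if the empirical pmf is zero at the given location."""
    return pmf.get(pair, 0.0) == 0.0

# Example: the pair (1, 1) never occurs, so P_n(1, 1) = 0.
xs = [0, 0, 1, 1]
ys = [0, 1, 0, 0]
pmf = empirical_joint_pmf(xs, ys)
```

Detecting such zero locations is what permits encoding at rates below (H(X), H(Y)) in the truly lossless setting.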
Keywords :
communication complexity; probability; source coding; statistics; communication complexity; joint statistics; lossless distributed source coding; probability mass functions; Computer networks; Costs; Distributed computing; Encoding; Physics computing; Probability; Protocols; Random variables; Source coding; Statistical distributions;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the International Symposium on Information Theory (ISIT 2004)
Conference_Location :
Chicago, IL
Print_ISBN :
0-7803-8280-3
Type :
conf
DOI :
10.1109/ISIT.2004.1365399
Filename :
1365399