DocumentCode :
2087979
Title :
On the entropy region of discrete and continuous random variables and network information theory
Author :
Shadbakht, Sormeh ; Hassibi, Babak
Author_Institution :
Electr. Eng. Dept., California Inst. of Technol., Pasadena, CA
fYear :
2008
fDate :
26-29 Oct. 2008
Firstpage :
2130
Lastpage :
2134
Abstract :
We show that a large class of network information theory problems can be cast as convex optimization over the convex space of entropy vectors. A vector in 2^n - 1 dimensional space is called entropic if each of its entries can be regarded as the joint entropy of a particular subset of n random variables (note that any set of size n has 2^n - 1 nonempty subsets). While an explicit characterization of the space of entropy vectors is well known for n = 2, 3 random variables, it is unknown for n > 3 (which is why most network information theory problems are open). We will construct inner bounds to the space of entropic vectors using tools such as quasi-uniform distributions, lattices, and Cayley's hyperdeterminant.
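As a concrete illustration of the entropy-vector construction described in the abstract, the following is a minimal sketch (not taken from the paper) that computes the 2^n - 1 joint entropies of all nonempty subsets of n discrete random variables from a known joint pmf. The function name entropy_vector, the NumPy array representation of the pmf, and the use of bits (log base 2) are illustrative assumptions.

import itertools
import numpy as np

def entropy_vector(joint_pmf):
    # joint_pmf: an n-dimensional NumPy array whose entries sum to 1,
    # the joint pmf of n discrete random variables X_1, ..., X_n.
    # Returns a dict mapping each nonempty subset S of {0, ..., n-1}
    # to the joint entropy H(X_S) in bits -- the 2^n - 1 entries of
    # the entropy vector.
    n = joint_pmf.ndim
    h = {}
    for r in range(1, n + 1):
        for subset in itertools.combinations(range(n), r):
            # Marginalize out the variables not in the subset.
            other_axes = tuple(i for i in range(n) if i not in subset)
            marginal = joint_pmf.sum(axis=other_axes) if other_axes else joint_pmf
            p = marginal[marginal > 0]          # ignore zero-probability cells
            h[subset] = float(-(p * np.log2(p)).sum())
    return h

# Example: two independent fair bits give H(X1) = H(X2) = 1 and H(X1, X2) = 2,
# so the vector (1, 1, 2) is entropic for n = 2.
print(entropy_vector(np.full((2, 2), 0.25)))

Every vector produced this way is entropic by construction; the open problem the abstract refers to is the converse direction for n > 3, namely characterizing which vectors in 2^n - 1 dimensional space arise as entropy vectors of some joint distribution.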
Keywords :
entropy; vectors; continuous random variables; convex optimization; discrete variables; entropy region; entropy vectors; network information theory; Communication networks; Contracts; Entropy; Information theory; Interference channels; Lattices; Random variables; Relays; Space technology;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Signals, Systems and Computers, 2008 42nd Asilomar Conference on
Conference_Location :
Pacific Grove, CA
ISSN :
1058-6393
Print_ISBN :
978-1-4244-2940-0
Electronic_ISBN :
1058-6393
Type :
conf
DOI :
10.1109/ACSSC.2008.5074810
Filename :
5074810