Title :
Note on Mutual Information and Orthogonal Space-Time Codes
Author :
Bresler, Guy ; Hajek, Bruce
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of Illinois at Urbana-Champaign, Urbana, IL
Abstract :
Bit-error probability and mutual information rate have both been used as performance criteria for space-time codes for wireless communication. We use mutual information as the performance criterion because it determines the achievable rate of communication when an outer code is used. In this context, linear dispersion codes, first proposed by Hassibi and Hochwald, are appealing because of the high mutual information they provide, as well as their simplicity. Because complexity increases with the number of symbols, it may be sensible in some settings to fix the number of symbols sent per data bit. In the dissertation of Y. Jiang, it was conjectured that among linear dispersion codes with independent, binary symbols, orthogonal space-time codes are optimal in the following sense: they maximize mutual information subject to an average power constraint on each symbol. We prove the conjecture for a fixed number of real symbols with arbitrary distributions.
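As a hedged illustration (not taken from the paper itself), the Alamouti code is the standard example of an orthogonal space-time code, and it can be written as a linear dispersion code in the Hassibi–Hochwald sense. The sketch below draws the independent binary (BPSK) symbols mentioned in the conjecture and verifies the defining orthogonality property X^H X = (|s1|^2 + |s2|^2) I; the function name `alamouti` is our own choice for this demonstration.

```python
import numpy as np

def alamouti(s1: complex, s2: complex) -> np.ndarray:
    """2x2 Alamouti codeword: rows index time slots, columns index antennas.

    Illustrative helper (name chosen here, not from the paper).
    """
    return np.array([[s1, s2],
                     [-np.conj(s2), np.conj(s1)]])

rng = np.random.default_rng(0)
# Independent binary symbols, as in the conjecture (BPSK: +/-1).
s1, s2 = rng.choice([-1.0, 1.0]), rng.choice([-1.0, 1.0])

X = alamouti(s1, s2)
gram = X.conj().T @ X
# Orthogonality: the Gram matrix is a scaled identity for any symbol values.
print(np.allclose(gram, (abs(s1) ** 2 + abs(s2) ** 2) * np.eye(2)))  # True
```

This orthogonality is what makes the per-symbol mutual information of orthogonal designs easy to characterize, which is the setting of the optimality result stated in the abstract.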
Keywords :
binary codes; error statistics; linear codes; radio networks; space-time codes; arbitrary distributions; bit-error probability; independent binary symbols; linear dispersion codes; mutual information rate; orthogonal space-time codes; performance criterion; power constraint; wireless communication; Context; Decoding; Error probability; MIMO; Mutual information; Random variables; Space time codes; Symmetric matrices; Transmitters; Wireless communication
Conference_Title :
2006 IEEE International Symposium on Information Theory
Conference_Location :
Seattle, WA
Print_ISBN :
1-4244-0505-X
Electronic_ISBN :
1-4244-0504-1
DOI :
10.1109/ISIT.2006.262039