DocumentCode
579050
Title
The sum-capacity of discrete-noise multiple-access channels with single-user decoding and identical codebooks
Author
Abou-Faycal, Ibrahim; Rustom, Elie
Author_Institution
Dept. of Electr. & Comput. Eng., American Univ. of Beirut, Beirut, Lebanon
fYear
2012
fDate
10-15 June 2012
Firstpage
2223
Lastpage
2227
Abstract
We consider an additive multiple-access channel model in which all users are constrained to use identical codebooks and single-user decoding is performed at the receiver. We study the sum-capacity of the channel for an arbitrarily large, but finite, number of users. For a noiseless n-user channel, we construct a signaling scheme that achieves arbitrarily large rates per user, proving that the sum-capacity is infinite whether or not the users are average- and/or peak-power limited. We show that this result continues to hold when an arbitrary discrete-noise component is added, provided a positive lower bound exists on the separation between noise samples. Whenever the noise has bounded support, the non-power-constrained sum-capacity is also proven to be infinite. The results remain valid for an asynchronous multiple-access channel with single-user decoding, since the appropriate channel model is identical to the one studied in this work.
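A minimal sketch of the n-user additive multiple-access channel model the abstract appears to refer to; the symbols Y, X_i, Z and the codebook C are assumed notation, not taken from the paper:

\[
Y = \sum_{i=1}^{n} X_i + Z, \qquad X_i \in \mathcal{C} \quad \text{for every user } i,
\]

where every user transmits a codeword from the same codebook \(\mathcal{C}\), \(Z\) is the (possibly discrete) additive noise, and the receiver decodes each user's message individually (single-user decoding).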
Keywords
decoding; multi-access systems; radio receivers; wireless channels; additive multiple-access channel model; arbitrary discrete-noise component; asynchronous multiple-access channel; discrete-noise multiple-access channels; identical codebooks; noise samples; noiseless n-user channel; non-power-constrained sum-capacity; signaling scheme; single-user decoding; Additives; Channel models; Decoding; Multiaccess communication; Noise; Receivers
fLanguage
English
Publisher
ieee
Conference_Title
2012 IEEE International Conference on Communications (ICC)
Conference_Location
Ottawa, ON
ISSN
1550-3607
Print_ISBN
978-1-4577-2052-9
Electronic_ISBN
1550-3607
Type
conf
DOI
10.1109/ICC.2012.6364430
Filename
6364430