DocumentCode :
1175566
Title :
Quantization of Log-Likelihood Ratios to Maximize Mutual Information
Author :
Rave, Wolfgang
Author_Institution :
Tech. Univ. Dresden, Dresden
Volume :
16
Issue :
4
fYear :
2009
fDate :
4/1/2009
Firstpage :
283
Lastpage :
286
Abstract :
We propose a quantization scheme for log-likelihood ratios that optimizes the trade-off between rate and accuracy in the sense of rate-distortion theory: using mutual information as the distortion measure, we determine quantization and decision levels that maximize mutual information for a given rate over a Gaussian channel. This approach is slightly superior to the previously proposed idea of applying the Lloyd-Max algorithm to the 'soft bit' density associated with the L-values. A further data-rate reduction can be achieved with entropy coding, because the optimum quantization levels based on mutual information occur with pronounced unequal probabilities.
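As a rough illustration of the idea described in the abstract (a sketch, not the paper's actual algorithm), the following Python snippet picks symmetric decision thresholds for a 2-bit (4-level) LLR quantizer by brute-force maximization of the mutual information between a transmitted BPSK bit and the quantizer output. It assumes consistent Gaussian L-values, i.e. L | x = ±1 ~ N(±mu, 2·mu); the operating point mu = 4.0 and the threshold grid are arbitrary choices for the example.

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def mutual_information(thresholds, mu):
    """Mutual information (bits) between an equiprobable BPSK bit and the
    quantized LLR, for consistent Gaussian L-values N(+/-mu, 2*mu)."""
    s = math.sqrt(2.0 * mu)  # standard deviation of the L-value density
    edges = [-math.inf] + sorted(thresholds) + [math.inf]
    mi = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        # Probability mass of this quantizer cell under each bit hypothesis
        p_pos = phi((hi - mu) / s) - phi((lo - mu) / s)
        p_neg = phi((hi + mu) / s) - phi((lo + mu) / s)
        p_cell = 0.5 * (p_pos + p_neg)
        for p_cond in (p_pos, p_neg):
            if p_cond > 0.0 and p_cell > 0.0:
                mi += 0.5 * p_cond * math.log2(p_cond / p_cell)
    return mi

mu = 4.0  # hypothetical operating point (mu = 2/sigma^2 for AWGN)
# Grid search over the outer threshold t of a symmetric 4-level quantizer
# with decision levels {-t, 0, t}.
best_t, best_mi = max(
    ((t, mutual_information([-t, 0.0, t], mu))
     for t in (i * 0.05 for i in range(1, 200))),
    key=lambda pair: pair[1],
)
```

A 4-level quantizer chosen this way should retain strictly more mutual information than a hard decision (single threshold at zero), which is the trade-off between rate and accuracy the abstract refers to.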
Keywords :
entropy codes; iterative decoding; Gaussian channel; Lloyd-Max algorithm; entropy coding; L-values; log-likelihood ratios; mutual information; quantization; soft bits; AWGN; Decoding; Distortion measurement; Entropy coding; Gaussian channels; Mutual information; Quantization; Rate distortion theory; Signal processing; Signal processing algorithms; Entropy coding; iterative decoding; mutual information; quantization; soft bits;
fLanguage :
English
Journal_Title :
Signal Processing Letters, IEEE
Publisher :
ieee
ISSN :
1070-9908
Type :
jour
DOI :
10.1109/LSP.2009.2014094
Filename :
4787271