Title :
Quantization of Log-Likelihood Ratios to Maximize Mutual Information
Author_Institution :
Tech. Univ. Dresden, Dresden
Date :
4/1/2009 12:00:00 AM
Abstract :
We propose a quantization scheme for log-likelihood ratios that optimizes the trade-off between rate and accuracy in the sense of rate-distortion theory: using mutual information as the distortion measure, we determine quantization and decision levels that maximize mutual information for a given rate over a Gaussian channel. This approach is slightly superior to the previously proposed idea of applying the Lloyd-Max algorithm to the "soft bit" density associated with the L-values. A further data-rate reduction can be achieved with entropy coding, because the optimum quantization levels based on mutual information occur with pronounced unequal probabilities.
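The design criterion described in the abstract can be illustrated with a small sketch. The following Python snippet (an assumption for illustration, not the authors' implementation) quantizes the LLR L = 2y/σ² of a BPSK symbol over an AWGN channel with a symmetric 2-bit quantizer and grid-searches the decision threshold that maximizes the mutual information I(X;Q) between the channel input and the quantizer output:

```python
import math

def phi(z):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def mutual_information(thresholds, sigma):
    """I(X;Q) in bits for equiprobable BPSK (x = +/-1) over AWGN with
    noise std sigma, where the LLR L = 2y/sigma^2 is quantized with the
    given decision thresholds (specified on the LLR axis)."""
    # An LLR cell [a, b] corresponds to y in [a*sigma^2/2, b*sigma^2/2].
    edges = [-math.inf] + sorted(thresholds) + [math.inf]
    mi = 0.0
    for a, b in zip(edges[:-1], edges[1:]):
        ya, yb = a * sigma**2 / 2.0, b * sigma**2 / 2.0
        p_plus = phi((yb - 1.0) / sigma) - phi((ya - 1.0) / sigma)   # P(Q=q | x=+1)
        p_minus = phi((yb + 1.0) / sigma) - phi((ya + 1.0) / sigma)  # P(Q=q | x=-1)
        p_q = 0.5 * (p_plus + p_minus)                               # P(Q=q)
        for p_cond in (p_plus, p_minus):
            if p_cond > 0.0:
                mi += 0.5 * p_cond * math.log2(p_cond / p_q)
    return mi

# Grid search for the symmetric 2-bit quantizer {-t, 0, t} maximizing I(X;Q).
sigma = 1.0
best_t, best_mi = max(
    ((t / 100.0, mutual_information([-t / 100.0, 0.0, t / 100.0], sigma))
     for t in range(1, 801)),
    key=lambda pair: pair[1],
)
```

With this criterion, the optimal 2-bit quantizer recovers noticeably more mutual information than a hard decision (the single threshold at L = 0), which is the effect the letter exploits; the unequal cell probabilities it produces are what make the subsequent entropy coding worthwhile.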
Keywords :
entropy codes; entropy coding; iterative decoding; Gaussian channels; Lloyd-Max algorithm; L-values; log-likelihood ratios; mutual information; quantization; soft bits; AWGN; decoding; distortion measurement; rate distortion theory; signal processing; signal processing algorithms;
Journal_Title :
IEEE Signal Processing Letters
DOI :
10.1109/LSP.2009.2014094