Title :
Dithered GMD Transform Coding
Author :
Weng, Ching-Chih ; Vaidyanathan, P.P. ; Su, Han-I
Author_Institution :
California Inst. of Technol., Pasadena, CA, USA
Date :
5/1/2010
Abstract :
The geometric mean decomposition (GMD) transform coder (TC) was recently introduced and was shown to achieve the optimal coding gain without bit loading under the high bit rate assumption. However, the performance of the GMD transform coder degrades at low rates, for two main reasons. First, the high bit rate quantizer model becomes invalid. Second, the quantization error is no longer negligible in the prediction process when the bit rate is low. In this letter, we introduce dithered quantization to address the first difficulty, and then redesign the precoders and predictors in the GMD transform coders to address the second. We propose two dithered GMD transform coders: the GMD subtractive dithered transform coder (GMD-SD), where the decoder has access to the dither information, and the GMD nonsubtractive dithered transform coder (GMD-NSD), where the decoder has no knowledge of the dither. Under the uniform bit loading scheme in scalar quantizers, the proposed dithered GMD transform coders are shown to perform significantly better than the original GMD coder at low rates.
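The subtractive/nonsubtractive distinction in the abstract can be illustrated on a plain uniform scalar quantizer. The sketch below is not the paper's GMD transform coder; it only shows the two dithering schemes, assuming a mid-tread quantizer and uniform dither over one quantization step (the function names are illustrative).

```python
import numpy as np

def uniform_quantize(x, step):
    # Mid-tread uniform scalar quantizer with step size `step`.
    return step * np.round(x / step)

def subtractive_dither_quantize(x, step, rng):
    # Subtractive dither (as in GMD-SD): the decoder knows the dither u
    # and removes it, so the end-to-end error Q(x+u) - u - x is uniform
    # on [-step/2, step/2] and independent of the input signal.
    u = rng.uniform(-step / 2, step / 2, size=np.shape(x))
    return uniform_quantize(x + u, step) - u

def nonsubtractive_dither_quantize(x, step, rng):
    # Nonsubtractive dither (as in GMD-NSD): the decoder has no access
    # to u, so the dither is not removed; the error moments become
    # input-independent, at the cost of a larger error variance than
    # the step**2 / 12 of the subtractive scheme.
    u = rng.uniform(-step / 2, step / 2, size=np.shape(x))
    return uniform_quantize(x + u, step)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=10_000)
    step = 0.25
    err_sd = subtractive_dither_quantize(x, step, rng) - x
    # Subtractive-dither error is strictly bounded by half a step.
    print(np.max(np.abs(err_sd)) <= step / 2 + 1e-12)
```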
Keywords :
encoding; quantisation (signal); transforms; GMD nonsubtractive dithered transform coder; GMD subtractive dithered transform coder; GMD transform coder; dithered quantization; geometric mean decomposition; high bit rate assumption; high bit rate quantizer model; optimal coding gain; prediction process; quantization error; bit allocation; linear prediction; transform coding;
Journal_Title :
Signal Processing Letters, IEEE
DOI :
10.1109/LSP.2010.2043887