DocumentCode :
313998
Title :
Entropy coded successively refinable uniform threshold quantizers
Author :
Brunk, Hugh ; Jafarkhani, Hamid ; Farvardin, Nariman
Author_Institution :
Dept. of Electr. Eng., Maryland Univ., College Park, MD, USA
fYear :
1997
fDate :
29 Jun-4 Jul 1997
Firstpage :
58
Abstract :
We examine the performance of entropy-coded successively refinable uniform threshold quantizers, which have been used in numerous proposed progressive image coders. We view a successively refinable quantizer with N stages of refinement as consisting of a sequence of partitions {Pn} and a sequence of codebooks {Cn}, 1⩽n⩽N. We denote the nth reconstruction of an input sample x by xn; it is obtained from the nth partition and the nth codebook using a simple quantization rule. We consider the design of entropy-coded successively refinable scalar quantizers in which the finest (highest-rate) partition and its corresponding codebook form a uniform threshold quantizer (UTQ). All codebooks are designed optimally for their corresponding partitions; it is well known that entropy-coded UTQs perform within 0.255 bits/sample of the rate-distortion bound for a variety of source distributions.
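Note: the following is a minimal illustrative sketch of a successively refinable uniform threshold quantizer, not the paper's design. It assumes a simple embedded structure in which the step size is halved at each stage (so each stage's partition refines the previous one) and uses midpoint reconstruction rather than the optimally designed codebooks and entropy coding described in the abstract.

```python
import numpy as np

def refine_utq(x, base_step, n_stages):
    """Successively refine a sample x through uniform threshold quantizers
    whose cells are halved at each stage (illustrative embedded structure).
    Midpoint reconstruction stands in for the paper's optimal codebooks.
    Returns the stage indices and reconstructions x_1, ..., x_N."""
    indices, recons = [], []
    for n in range(n_stages):
        step = base_step / (2 ** n)      # finer partition P_n at each stage
        idx = int(np.floor(x / step))    # cell index of x in partition P_n
        x_n = (idx + 0.5) * step         # midpoint codeword from codebook C_n
        indices.append(idx)
        recons.append(x_n)
    return indices, recons

# Example: refine one Gaussian sample through 4 stages
rng = np.random.default_rng(0)
x = rng.standard_normal()
idx, rec = refine_utq(x, base_step=1.0, n_stages=4)
print(x, rec)  # reconstructions approach x as the partition is refined
```

In a full coder the stage indices (or the incremental refinement bits) would be entropy coded; this sketch only illustrates the partition/codebook refinement structure.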
Keywords :
entropy codes; image coding; image reconstruction; image sampling; rate distortion theory; codebooks; entropy coded quantizers; highest rate partition; input sample reconstruction; partitions sequence; performance; progressive image coders; quantization rule; rate distortion bound; source distributions; successively refinable uniform threshold quantizers; Chromium; Educational institutions; Entropy; Gaussian distribution; Image reconstruction; Rate-distortion; Streaming media; Time sharing computer systems; Tree data structures;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 1997 IEEE International Symposium on Information Theory
Conference_Location :
Ulm
Print_ISBN :
0-7803-3956-8
Type :
conf
DOI :
10.1109/ISIT.1997.612973
Filename :
612973