DocumentCode :
1553215
Title :
Sequential prediction and ranking in universal context modeling and data compression
Author :
Weinberger, Marcelo J. ; Seroussi, Gadiel
Author_Institution :
Hewlett-Packard Co., Palo Alto, CA, USA
Volume :
43
Issue :
5
fYear :
1997
fDate :
9/1/1997
Firstpage :
1697
Lastpage :
1706
Abstract :
Most state-of-the-art lossless image compression schemes use prediction followed by some form of context modeling. This might seem redundant at first, as the contextual information used for prediction is also available for building the compression model, and a universal coder will eventually learn the “predictive” patterns of the data. In this correspondence, we provide a formal justification for the combination of these two modeling tools, by showing that a combined scheme may result in a faster rate of convergence to the source entropy. This is achieved via a reduction in the model cost of universal coding. In deriving the main result, we develop the concept of sequential ranking, which can be seen as a generalization of sequential prediction, and we study its combinatorial and probabilistic properties.
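The following is a minimal, hypothetical sketch (not the paper's algorithm) of what sequential ranking means in a context-modeling setting: at each step, the symbols of the alphabet are ranked by their frequency in the current context, and the quantity of interest is the rank assigned to the symbol that actually occurs; sequential prediction is the special case that only asks whether the top-ranked symbol was correct. The function name, context length, and example string are illustrative assumptions.

```python
from collections import defaultdict


def sequential_ranks(data, context_len=2, alphabet=None):
    """Return the rank (0 = most likely) assigned to each symbol of `data`,
    using per-context frequency counts accumulated sequentially."""
    if alphabet is None:
        alphabet = sorted(set(data))
    counts = defaultdict(lambda: {a: 0 for a in alphabet})
    ranks = []
    for i, symbol in enumerate(data):
        context = tuple(data[max(0, i - context_len):i])
        # Rank the alphabet by decreasing count in this context
        # (ties broken by symbol order).
        ordering = sorted(alphabet, key=lambda a: (-counts[context][a], a))
        ranks.append(ordering.index(symbol))
        counts[context][symbol] += 1  # sequential update: past symbols only
    return ranks


if __name__ == "__main__":
    text = "abracadabraabracadabra"
    ranks = sequential_ranks(list(text))
    # Prediction hit rate = fraction of steps where the top-ranked symbol occurred.
    accuracy = sum(r == 0 for r in ranks) / len(ranks)
    print("ranks:", ranks)
    print("prediction hit rate:", round(accuracy, 3))
```

Low ranks concentrate probability on few values, which is the intuition behind the reduced model cost discussed in the abstract; the sketch only illustrates the ranking mechanism, not the paper's analysis.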
Keywords :
combinatorial mathematics; convergence; entropy; image coding; prediction theory; sequences; source coding; combinatorial properties; convergence rate; data compression; formal justification; lossless image compression schemes; modeling tools; predictive patterns; probabilistic properties; sequential prediction; sequential ranking; source entropy; universal coder; universal coding; universal context modeling; Context modeling; Convergence; Costs; Data compression; Decorrelation; Entropy; Image coding; Prediction algorithms; Predictive models; Probability distribution;
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.623176
Filename :
623176