Abstract:
Transform-coded images exhibit distortions that fall outside the assumptions of traditional denoising techniques. In this paper, we use tools from robust signal processing to construct linear, worst-case estimators for the denoising of transform-compressed images. We show that while standard denoising is fundamentally determined by statistical models for images alone, the distortions induced by transform coding depend heavily on the structure of the transform used. Our method therefore uses simple models for the image and for the quantization error, with the latter capturing the transform dependency. Based on these models, we derive linear estimators of the original image that are optimal in the mean-squared-error sense for the worst-case cross-correlation between the original image and the quantization error. Our construction is transform-agnostic and applies to transforms ranging from block discrete cosine transforms to wavelets. Furthermore, our approach accommodates different types of image statistics and can also serve as an optimization tool for the design of transforms and quantizers. Through the interaction of the source and quantizer models, our work provides useful insights and is instrumental in identifying and removing quantization artifacts from general signals coded with general transforms. Because we decouple the modeling and processing steps, we allow for the construction of many different types of estimators, depending on the desired sophistication and the available computational complexity. At the low end of this spectrum, our lookup-table-based estimator, which can be deployed in low-complexity environments, provides PSNR values competitive with some of the best results in the literature.
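To make the abstract's central point concrete, the following is a minimal scalar sketch (not the paper's estimator) of why worst-case linear estimation differs from classical Wiener denoising once the noise is quantization error. For the observation y = x + n with a linear estimate a·y, the MSE is (a-1)²·var(x) + a²·var(n) + 2a(a-1)·E[xn]; bounding |E[xn]| ≤ c and taking the worst case over that interval (for a in [0, 1], the worst case is E[xn] = -c) gives the closed-form gain below, which reduces to the plain Wiener gain when c = 0. The Laplacian source, quantizer step, and the moment bound c are illustrative assumptions, not values from the paper.

```python
import numpy as np

def worst_case_gain(var_x, var_n, c):
    """Gain a of x_hat = a*y minimizing the worst-case MSE over
    cross-correlations E[x n] in [-c, c]; c = 0 recovers the
    classical Wiener gain var_x / (var_x + var_n)."""
    return (var_x - c) / (var_x + var_n - 2.0 * c)

rng = np.random.default_rng(0)

# Toy "DCT coefficient": Laplacian source, coarse uniform quantizer.
# The quantization error n = q(x) - x is correlated with x, violating
# the independent-noise assumption behind classical denoising.
x = rng.laplace(scale=4.0, size=200_000)
delta = 8.0
y = delta * np.round(x / delta)   # quantized (transform-coded) value
n = y - x                         # quantization error

var_x, var_n = x.var(), n.var()
c = abs(np.mean(x * n))           # bound on |E[x n]| (measured here)

a_wiener = worst_case_gain(var_x, var_n, 0.0)  # ignores the correlation
a_robust = worst_case_gain(var_x, var_n, c)    # accounts for it

mse = lambda a: np.mean((a * y - x) ** 2)
print(f"Wiener gain {a_wiener:.3f}: MSE {mse(a_wiener):.3f}")
print(f"robust gain {a_robust:.3f}: MSE {mse(a_robust):.3f}")
```

In this toy setting the measured cross-correlation is negative, so the robust gain shrinks the observation less than the Wiener gain and achieves a lower empirical MSE, illustrating the transform/quantizer dependency the abstract refers to.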
Keywords:
artifacts; computational complexity; cross-correlation; data compression; deblocking; denoising; discrete cosine transforms; discrete wavelet transforms; image coding; image denoising; image statistics; lookup-table-based estimation; mean-squared-error methods; noise reduction; optimization; postprocessing; PSNR; quantization; robust signal processing; statistical analysis; transform coding; worst-case estimation