Title :
Combining support vector machine learning with the discrete cosine transform in image compression
Author :
Robinson, Jonathan ; Kecman, Vojislav
Author_Institution :
Sch. of Eng., Univ. of Auckland, New Zealand
Date :
7/1/2003
Abstract :
We present an algorithm for the application of support vector machine (SVM) learning to image compression. The algorithm combines SVMs with the discrete cosine transform (DCT). Unlike classic radial basis function networks or multilayer perceptrons, which require the topology of the network to be defined before training, an SVM selects the minimum number of training points, called support vectors, that ensure modeling of the data within the given level of accuracy (a.k.a. insensitivity zone ε). It is this property that is exploited as the basis for an image compression algorithm. Here, the SVM learning algorithm performs the compression in the spectral domain of DCT coefficients, i.e., the SVM approximates the DCT coefficients. The parameters of the SVM are stored in order to recover the image. Results demonstrate that even though there is an extra lossy step compared with the baseline JPEG algorithm, the new algorithm dramatically increases compression for a given image quality; conversely, it increases image quality for a given compression ratio. The approach presented can be readily applied to other modeling schemes that take the form of a sum of weighted basis functions.
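The sketch below illustrates the idea described in the abstract, not the authors' exact implementation: the DCT coefficients of a single 8x8 block are treated as a function of coefficient index and approximated by an ε-insensitive SVM regressor, so that only the support vectors (and their weights) need to be stored. The block size, kernel, and all hyperparameter values (C, gamma, epsilon) are illustrative assumptions.

```python
# Hedged sketch: SVM regression over DCT coefficients of one 8x8 block.
# Larger epsilon -> wider insensitivity zone -> fewer support vectors
# (more compression) at the cost of approximation accuracy.
import numpy as np
from scipy.fftpack import dct, idct
from sklearn.svm import SVR

def dct2(block):
    # 2-D orthonormal DCT (rows then columns)
    return dct(dct(block, norm='ortho', axis=0), norm='ortho', axis=1)

def idct2(coeffs):
    # Inverse 2-D orthonormal DCT
    return idct(idct(coeffs, norm='ortho', axis=0), norm='ortho', axis=1)

rng = np.random.default_rng(0)
block = rng.integers(0, 256, size=(8, 8)).astype(float)  # stand-in image block

coeffs = dct2(block)                        # move to the spectral (DCT) domain
x = np.arange(coeffs.size).reshape(-1, 1)   # coefficient index as SVM input
y = coeffs.flatten()                        # coefficient values to approximate

# Epsilon-insensitive SVM regression; hyperparameters are assumptions.
svr = SVR(kernel='rbf', C=1e3, gamma=0.1, epsilon=2.0)
svr.fit(x, y)

# Only the support vectors and associated weights would be stored/transmitted.
print(f"support vectors kept: {len(svr.support_)} of {y.size} coefficients")

# Decode: evaluate the SVM model over all indices and apply the inverse DCT.
recovered = idct2(svr.predict(x).reshape(8, 8))
print(f"mean absolute reconstruction error: {np.abs(block - recovered).mean():.2f}")
```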
Keywords :
data compression; discrete cosine transforms; image coding; learning (artificial intelligence); learning automata; compression ratio; discrete cosine transform; image compression; image quality; insensitivity zone; kernel machines; level of accuracy; lossy step; support vector machine learning; training points; Discrete cosine transforms; Image coding; Image quality; Machine learning; Machine learning algorithms; Multilayer perceptrons; Network topology; Radial basis function networks; Support vector machines; Transform coding;
Journal_Title :
Neural Networks, IEEE Transactions on
DOI :
10.1109/TNN.2003.813842