Title :
Vector quantization with model selection
Author_Institution :
Information Syst. Lab., Stanford Univ., CA, USA
Abstract :
We propose an iterative algorithm that incorporates model selection into entropy-constrained vector quantization. Two model selection steps are added to the classical Lloyd algorithm as additional necessary conditions for optimality. Codewords are pruned using a Lagrangian with entropy and codebook-size constraints, and relevant features are identified using partitioned vector quantization. Relevant and irrelevant features are modelled independently; moreover, irrelevant features are modelled by a global probability density function, making them independent of the partition cells. This avoids the difficulty of comparing the performance of vector quantizers operating in spaces of different dimensions. As the Lagrangian decreases, we not only obtain a locally optimal codebook but also reduce the codebook size and identify the relevant features.
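The abstract describes a Lloyd-style iteration in which a Lagrangian penalizing both entropy and codebook size drives codeword pruning. As a rough illustration only, the sketch below shows how such an assignment/update loop with pruning could look; the squared-error distortion, the multipliers lam and gamma, and the pruning rule are assumptions for the sketch, not the authors' formulation.

```python
# Minimal sketch of an entropy-constrained Lloyd iteration with codeword
# pruning. Assumptions: squared-error distortion, entropy multiplier `lam`,
# codebook-size multiplier `gamma`; these are illustrative, not the paper's
# exact algorithm (which also includes a partitioned-VQ feature-selection step).
import numpy as np

def ecvq_with_pruning(X, codebook, lam=0.1, gamma=1e-3, n_iter=20):
    """X: (n, d) data; codebook: (k, d) initial codewords."""
    n = len(X)
    probs = np.full(len(codebook), 1.0 / len(codebook))
    for _ in range(n_iter):
        # Assignment step: minimize distortion + lam * codeword rate (-log2 p_i).
        rate = -np.log2(np.maximum(probs, 1e-12))
        dist = ((X[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        labels = np.argmin(dist + lam * rate[None, :], axis=1)

        # Update step: recompute centroids and cell probabilities.
        counts = np.bincount(labels, minlength=len(codebook))
        keep = counts > 0
        for i in np.flatnonzero(keep):
            codebook[i] = X[labels == i].mean(axis=0)
        probs = counts / n

        # Model-selection step (sketch): drop empty cells so the codebook
        # shrinks as the Lagrangian J = D + lam*H + gamma*|codebook| decreases;
        # gamma would further favour pruning rarely used codewords in a fuller
        # implementation.
        codebook, probs = codebook[keep], probs[keep]
        probs /= probs.sum()
    return codebook, probs
```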
Keywords :
entropy; iterative methods; vector quantisation; Lagrangian; classic Lloyd algorithm; codebook size constraints; codewords; entropy-constrained vector quantization; global probability density function; iterative algorithm; model selection; partitioned vector quantization; Clustering algorithms; Feature extraction; Gaussian processes; Iterative algorithms; Lagrangian functions; Partitioning algorithms; Probability density function; Signal processing algorithms; Supervised learning; Vector quantization;
Conference_Title :
Data Compression Conference, 2006. DCC 2006. Proceedings
Print_ISBN :
0-7695-2545-8
DOI :
10.1109/DCC.2006.82