DocumentCode
2952007
Title
Entropy and Memory Constrained Vector Quantization with Separability Based Feature Selection
Author
Yoon, Sangho ; Gray, Robert M.
Author_Institution
Dept. of Electr. Eng., Stanford Univ., CA
fYear
2006
fDate
9-12 July 2006
Firstpage
269
Lastpage
272
Abstract
An iterative model selection algorithm is proposed. The algorithm seeks relevant features and an optimal number of codewords (or codebook size) as part of the optimization. We use a well-known separability measure to perform feature selection, and we use a Lagrangian with entropy and codebook size constraints to find the optimal number of codewords. We add two model selection steps to the quantization process: one for feature selection and the other for choosing the number of clusters. Once relevant and irrelevant features are identified, we also estimate the probability density function of the irrelevant features instead of discarding them. This avoids the bias problem of the separability measure favoring high-dimensional spaces.
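The entropy-constrained assignment step implied by the abstract's Lagrangian can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes squared-error distortion, uses `-log2(p_i)` as the ideal codeword length for a codeword with empirical probability `p_i`, and all names (`ec_assign`, `codebook`, `probs`, `lam`) are hypothetical.

```python
import numpy as np

def ec_assign(x, codebook, probs, lam):
    """Index of the codeword minimizing J = d(x, c_i) + lam * (-log2 p_i).

    Illustrative sketch of an entropy-constrained VQ encoder step: the
    Lagrange multiplier `lam` trades distortion against codeword length.
    """
    dists = np.sum((codebook - x) ** 2, axis=1)  # squared-error distortion
    lengths = -np.log2(probs)                    # ideal code lengths in bits
    return int(np.argmin(dists + lam * lengths))

# Usage: two codewords; the sample lies nearer the rare codeword, so a
# large enough lam shifts the assignment toward the frequent one.
codebook = np.array([[0.0, 0.0], [1.0, 1.0]])
probs = np.array([0.9, 0.1])
x = np.array([0.6, 0.6])
print(ec_assign(x, codebook, probs, 0.0))  # lam = 0: nearest codeword, index 1
print(ec_assign(x, codebook, probs, 0.2))  # rate penalty flips choice to index 0
```

Sweeping `lam` traces out the rate-distortion trade-off, which is how the Lagrangian formulation selects an effective number of codewords: codewords that are rarely used incur long ideal code lengths and are eventually starved of samples.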
Keywords
entropy codes; feature extraction; iterative decoding; probability; vector quantisation; codebook; codewords; entropy constrained vector quantization; iterative model selection algorithm; memory constrained vector quantization; probability density function; separability based feature selection; Algorithm design and analysis; Clustering algorithms; Design optimization; Entropy; Gaussian processes; Information systems; Iterative algorithms; Probability density function; Signal processing algorithms; Vector quantization;
fLanguage
English
Publisher
ieee
Conference_Titel
Multimedia and Expo, 2006 IEEE International Conference on
Conference_Location
Toronto, Ont.
Print_ISBN
1-4244-0366-7
Electronic_ISBN
1-4244-0367-7
Type
conf
DOI
10.1109/ICME.2006.262450
Filename
4036588
Link To Document