Title :
Information Theoretic Vector Quantization with Fixed Point Updates
Author :
Rao, Sudhir ; Han, Seungju ; Principe, José
Author_Institution :
University of Florida, Gainesville, FL
Abstract :
In this paper, we revisit the information theoretic vector quantization (ITVQ) algorithm introduced in (T. Lehn-Schioler et al., 2005) and make it practical. We derive a fixed-point update rule that minimizes the Cauchy-Schwarz (CS) pdf divergence between the set of codewords and the actual data. In doing so, we overcome two severe deficiencies of the previous gradient-based method, namely the number of parameters to be optimized and the slow convergence rate, thus making the algorithm more efficient and useful as a compression algorithm.
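To make the fixed-point idea concrete, below is a minimal NumPy sketch of the kind of update the abstract describes: with Parzen (Gaussian kernel) density estimates for the data and the codebook, the Cauchy-Schwarz divergence reduces to log-ratios of information potentials, and setting its gradient to zero yields a step-size-free update in which each codeword is attracted to nearby data and repelled by the other codewords. The function name itvq_fixed_point, the kernel width sigma, the initialization, and the stopping rule are illustrative assumptions rather than details taken from the paper.

```python
# A sketch of a fixed-point ITVQ-style update, assuming the data and
# codebook densities are Parzen estimates with a common Gaussian width.
import numpy as np

def itvq_fixed_point(X, M=16, sigma=0.5, iters=100, seed=0):
    """Move M codewords W to minimize the Cauchy-Schwarz divergence
    D_CS(f, g) = -log V_fg + (1/2) log V_ff + (1/2) log V_gg between the
    data density f and the codebook density g. Only V_fg and V_gg depend
    on W; solving the zero-gradient condition for each codeword gives a
    fixed-point iteration with no step size to tune."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    W = X[rng.choice(N, size=M, replace=False)].copy()  # init on data points
    s2 = 2.0 * sigma ** 2  # Parzen kernels convolve: effective variance doubles

    for _ in range(iters):
        # Gaussian kernels between codewords and data, and among codewords.
        Kxw = np.exp(-((W[:, None, :] - X[None, :, :]) ** 2).sum(-1) / (2 * s2))  # (M, N)
        Kww = np.exp(-((W[:, None, :] - W[None, :, :]) ** 2).sum(-1) / (2 * s2))  # (M, M)

        # Information potentials: cross term V_fg and codebook term V_gg.
        V_fg = Kxw.sum() / (N * M)
        V_gg = Kww.sum() / (M * M)

        # Attraction toward the data vs. repulsion between codewords.
        a_num = Kxw @ X / (N * V_fg)                          # (M, d)
        a_den = Kxw.sum(axis=1, keepdims=True) / (N * V_fg)   # (M, 1)
        r_num = Kww @ W / (M * V_gg)
        r_den = Kww.sum(axis=1, keepdims=True) / (M * V_gg)

        # Zero-gradient condition solved for each codeword; if repulsion ever
        # dominates the denominator, a smaller sigma (or annealing it) helps.
        W_new = (a_num - r_num) / (a_den - r_den)
        if np.allclose(W_new, W, atol=1e-6):
            break
        W = W_new
    return W

# Example: quantize samples from a two-mode 2-D Gaussian mixture.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 0.3, size=(250, 2)) for m in (-1.0, 1.0)])
W = itvq_fixed_point(X, M=8, sigma=0.3)
print(W)
```

Because the update comes from solving the stationarity condition directly rather than descending the gradient, there is no learning rate or annealing schedule forced on the user, which is the kind of parameter and convergence overhead of the gradient-based method that the abstract highlights.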
Keywords :
convergence; gradient methods; higher order statistics; optimisation; vector quantisation; data compression; fixed point update rule; gradient based method; information theoretic vector quantization; Annealing; Entropy; Gaussian processes; Kernel; Neural networks; Neurons; Optimization methods; Self organizing feature maps; Vector quantization;
Conference_Titel :
2007 International Joint Conference on Neural Networks (IJCNN 2007)
Conference_Location :
Orlando, FL
Print_ISBN :
978-1-4244-1379-9
ISSN :
1098-7576
DOI :
10.1109/IJCNN.2007.4371098