Title :
Competitive learning and winning-weighted competition for optimal vector quantizer design
Author :
Wang, Zhicheng; Hanson, John V.
Author_Institution :
Dept. of Electr. & Comput. Eng., Waterloo Univ., Ont., Canada
Abstract :
Building a nonparametric model to estimate a probability density function p(x) is essential in vector quantization, pattern recognition, control, and many other areas. Winning-weighted competitive learning (WWCL), a generalization of Kohonen learning, is presented; it introduces the principle of maximum information preservation into the learning to obtain a better approximation of p(x) and fast learning convergence. WWCL is a promising alternative and improvement to the generalized Lloyd algorithm (GLA), an iterative descent algorithm whose distortion function decreases monotonically toward a local minimum. WWCL is an online algorithm: the codebook is designed as training data arrive, and the reduction of the distortion function is not necessarily monotonic. Experimental results show that WWCL consistently provides better codebooks than Kohonen learning and the GLA in terms of distortion or convergence rate.
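The abstract does not give the WWCL update rule, so the following is only an illustrative sketch of the general idea it describes: an online competitive learner whose winner selection is weighted by each codeword's win count, so that all codewords tend to win equally often (in the spirit of maximum information preservation). The function name, learning-rate schedule, and weighting scheme are assumptions, not the paper's algorithm.

```python
import numpy as np

def winning_weighted_cl(data, k, lr0=0.1, seed=0):
    """Hedged sketch of winning-weighted competitive learning.

    Online codebook design: each input vector updates only the winning
    codeword, and the competition is biased against frequent winners.
    All specifics here are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    # initialize the codebook from k random training vectors
    codebook = data[rng.choice(len(data), size=k, replace=False)].astype(float)
    wins = np.ones(k)  # win counts used to weight the competition
    for t, x in enumerate(data):
        # distance scaled by relative win frequency: codewords that win
        # often are handicapped, letting rarely-winning codewords compete
        d = np.linalg.norm(codebook - x, axis=1) * (wins / wins.sum())
        j = int(np.argmin(d))          # winner of the weighted competition
        wins[j] += 1
        lr = lr0 / (1.0 + 0.01 * t)    # decaying learning rate (assumed schedule)
        codebook[j] += lr * (x - codebook[j])  # move winner toward the input
    return codebook
```

Because updates happen one sample at a time, the codebook is usable while data are still arriving, and the distortion need not decrease monotonically, matching the online behavior described above.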
Keywords :
optimisation; probability; self-organising feature maps; unsupervised learning; vector quantisation; Kohonen learning; codebook design; distortion function reduction; fast learning convergence; generalized Lloyd algorithm; maximum information preservation; nonparametric model; online algorithm; optimal vector quantizer design; pattern recognition; probability density function; winning-weighted competitive learning; Algorithm design and analysis; Clustering algorithms; Convergence; Neural networks; Neurons; Probability density function; Rate distortion theory; Signal processing algorithms; Training data; Vector quantization;
Conference_Titel :
Neural Networks for Signal Processing [1993] III. Proceedings of the 1993 IEEE-SP Workshop
Conference_Location :
Linthicum Heights, MD
Print_ISBN :
0-7803-0928-6
DOI :
10.1109/NNSP.1993.471884