Title :
Dynamic Hyperparameter Scaling Method for LVQ Algorithms
Author :
Seo, Sambu ; Obermayer, Klaus
Author_Institution :
Berlin University of Technology (Technische Universität Berlin), Berlin, Germany
Abstract :
We propose a new annealing method for the hyperparameters of several recent learning vector quantization (LVQ) algorithms. We first analyze the relationship between the values assigned to the hyperparameters, the online learning process, and the structure of the resulting classifier. Motivated by these results, we then suggest an annealing method in which each hyperparameter is initially set to a large value and is then slowly decreased during learning. We apply the annealing method to the LVQ 2.1, SLVQ-LR, and RSLVQ methods, and we compare the generalization performance achieved with the new annealing method against a standard hyperparameter selection using 10-fold cross-validation. Benchmark results are provided for the letter and pendigits data sets from the UCI machine learning repository. The new selection method provides equally good or, for some data sets, even superior results compared to standard selection methods. More importantly, the number of learning trials required for different hyperparameter values is drastically reduced. The results are insensitive to the form and parameters of the annealing schedule.
Keywords :
learning (artificial intelligence); pattern classification; vector quantisation; annealing method; dynamic hyperparameter scaling method; learning vector quantization; machine learning repository; online learning process; Annealing; Computational efficiency; Data analysis; Gene expression; Genetics; Machine learning; Prototypes; Speech analysis; Speech recognition; Vector quantization
Conference_Titel :
2006 IEEE International Joint Conference on Neural Networks (IJCNN '06)
Conference_Location :
Vancouver, BC, Canada
Print_ISBN :
0-7803-9490-9
DOI :
10.1109/IJCNN.2006.247304