Title of article
Soft nearest prototype classification
Author/Authors
S. Seo, M. Bode, K. Obermayer
Issue Information
Journal issue, serial year 2003
Pages
-38
From page
39
To page
0
Abstract
We propose a new method for constructing nearest prototype classifiers that is based on a Gaussian mixture ansatz and can be interpreted as an annealed version of learning vector quantization (LVQ). The algorithm performs gradient descent on a cost function that minimizes the classification error on the training set. We investigate the properties of the algorithm and assess its performance on several toy data sets and on an optical letter classification task. Results show 1) that annealing in the dispersion parameter of the Gaussian kernels improves classification accuracy; 2) that classification results are better than those obtained with standard learning vector quantization (LVQ 2.1, LVQ 3) for equal numbers of prototypes; and 3) that annealing of the width parameter improves the classification capability. Additionally, the principled approach provides an explanation for a number of features of the (heuristic) LVQ methods.
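The scheme the abstract describes — Gaussian assignment probabilities over labeled prototypes, gradient descent on the expected misclassification rate, and annealing of the kernel width — can be sketched as follows. This is a minimal illustration assuming the standard soft-assignment cost; the function names, hyperparameter values, and annealing schedule are illustrative choices, not the paper's exact settings.

```python
import numpy as np

def train_snpc(X, y, prototypes, proto_labels, sigma0=2.0, sigma_min=0.5,
               anneal=0.95, lr=0.1, epochs=50):
    """Soft nearest prototype training: gradient descent on the expected
    misclassification cost, with annealing of the Gaussian width sigma.
    Hyperparameter values here are illustrative, not from the paper."""
    P = prototypes.astype(float).copy()
    sigma = sigma0
    for _ in range(epochs):
        for x, label in zip(X, y):
            d2 = np.sum((P - x) ** 2, axis=1)        # squared distances to prototypes
            u = np.exp(-d2 / (2.0 * sigma ** 2))
            u /= u.sum()                              # Gaussian assignment probs P(j|x)
            wrong = (proto_labels != label).astype(float)
            lc = u @ wrong                            # local cost: P(misclassify x)
            # gradient: d(lc)/d(theta_j) = u_j (wrong_j - lc)(x - theta_j)/sigma^2
            coef = u * (wrong - lc) / sigma ** 2
            P -= lr * coef[:, None] * (x - P)         # correct protos attract, wrong repel
        sigma = max(sigma * anneal, sigma_min)        # anneal the dispersion parameter
    return P

def predict(X, P, proto_labels):
    """Hard nearest-prototype classification with the trained prototypes."""
    d2 = ((X[:, None, :] - P[None, :, :]) ** 2).sum(axis=-1)
    return proto_labels[d2.argmin(axis=1)]
```

Note how the per-prototype factor `u * (wrong - lc)` recovers the familiar LVQ behavior in the small-sigma limit: only the nearest prototypes retain nonzero responsibility, correctly labeled ones are attracted to the sample, and wrongly labeled ones are repelled.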
Keywords
Learning capability , two-hidden-layer feedforward networks (TLFNs) , neural-network modularity , Storage capacity
Journal title
IEEE TRANSACTIONS ON NEURAL NETWORKS
Serial Year
2003
Record number
62819
Link To Document