DocumentCode :
457200
Title :
Competitive Mixtures of Simple Neurons
Author :
Sridharan, Karthik ; Beal, Matthew J. ; Govindaraju, Venu
Author_Institution :
Dept. of Comput. Sci. & Eng., State Univ. of New York at Buffalo, NY
Volume :
2
fYear :
2006
fDate :
0-0 0
Firstpage :
494
Lastpage :
497
Abstract :
We propose a competitive finite mixture of neurons (or perceptrons) for solving binary classification problems. Our classifier includes a prior over the weights of the different neurons such that it prefers mixture models made up of neurons whose classification boundaries are as orthogonal to each other as possible. We derive an EM algorithm for learning the mixing proportions and the weights of each neuron, consisting of an exact E step and a partial M step, and show that our model covers the regions of high posterior probability in weight space and tends to reduce overfitting. We demonstrate how our mixture classifier works on a toy 2D data set, showing the effective use of strategically positioned components in the mixture. We further compare its performance against SVMs and one-hidden-layer neural networks on four real-world data sets from the UCI repository, and show that even a relatively small number of neurons with appropriate competitive priors can achieve superior classification accuracies on held-out test data.
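For orientation, below is a minimal sketch of the kind of training loop the abstract describes: a mixture of logistic neurons fit by EM with an exact E step (responsibilities) and a partial M step (one gradient step per neuron plus an update of the mixing proportions). The orthogonality penalty, function names, and toy data here are illustrative assumptions, not the authors' exact prior or experimental setup.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_competitive_mixture(X, y, K=3, n_iters=100, lr=0.5, ortho_strength=0.1, seed=0):
    """EM-style training of a mixture of K logistic neurons for labels y in {0,1}.

    The orthogonality penalty is an illustrative stand-in for the paper's
    competitive prior; the exact form used by the authors may differ.
    """
    rng = np.random.default_rng(seed)
    N, D = X.shape
    Xb = np.hstack([X, np.ones((N, 1))])        # append a bias input
    W = rng.normal(scale=0.1, size=(K, D + 1))  # one weight vector per neuron
    pi = np.full(K, 1.0 / K)                    # mixing proportions

    for _ in range(n_iters):
        # E step (exact): responsibility of each neuron for each example
        p = sigmoid(Xb @ W.T)                   # N x K, P(y=1 | x, w_k)
        lik = np.where(y[:, None] == 1, p, 1.0 - p)
        r = pi * lik
        r /= r.sum(axis=1, keepdims=True)

        # Partial M step: update mixing proportions, then take one gradient step
        # per neuron on its responsibility-weighted log-likelihood, with a penalty
        # that discourages parallel (non-orthogonal) weight vectors across neurons.
        pi = r.mean(axis=0)
        for k in range(K):
            grad = Xb.T @ (r[:, k] * (y - p[:, k]))   # weighted logistic gradient
            for j in range(K):
                if j != k:
                    grad -= ortho_strength * np.sign(W[k] @ W[j]) * W[j]
            W[k] += lr * grad / N
    return W, pi

def predict(W, pi, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    p = sigmoid(Xb @ W.T)                        # per-neuron P(y=1 | x)
    return (p @ pi > 0.5).astype(int)            # mixture-averaged prediction

if __name__ == "__main__":
    # Hypothetical toy 2D data with a noisy linear boundary, just to exercise the loop
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 2))
    y = ((X[:, 0] + X[:, 1] + 0.2 * rng.normal(size=200)) > 0).astype(int)
    W, pi = fit_competitive_mixture(X, y, K=4)
    print("train accuracy:", (predict(W, pi, X) == y).mean())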
Keywords :
expectation-maximisation algorithm; pattern classification; perceptrons; probability; EM algorithm; binary classification problem; high posterior probability; perceptrons; simple neurons; Computational modeling; Computer science; Cost function; Logistics; Neural networks; Neurons; Space stations; Testing; Training data; Venus;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
18th International Conference on Pattern Recognition (ICPR 2006)
Conference_Location :
Hong Kong
ISSN :
1051-4651
Print_ISBN :
0-7695-2521-0
Type :
conf
DOI :
10.1109/ICPR.2006.394
Filename :
1699251