DocumentCode
2497832
Title
Evidence-based mixture of MLP-experts
Author
Masoudnia, Saeed ; Rostami, Mohammad ; Tabassian, Mahdi ; Sajedin, Atena ; Ebrahimpour, Reza
Author_Institution
Math., Stat. & Comput. Sci. Dept., Univ. of Tehran, Tehran, Iran
fYear
2010
fDate
18-23 July 2010
Firstpage
1
Lastpage
7
Abstract
Mixture of Experts (ME) is a modular neural network architecture for supervised learning. In this paper, we propose an evidence-based ME to deal with the classification problem. In the basic form of ME, the problem space is automatically divided into several subspaces for the experts, and the outputs of the experts are combined by a gating network. Satisfactory performance of the basic ME depends on the diversity among the experts. In conventional ME, this diversity is provided by different initializations of the experts and by supervision of the gating network during the learning procedure. The main idea of our proposed method is to employ the Dempster-Shafer (D-S) theory of evidence to improve the determination of the learning parameters (which results in more diverse experts) and the way the experts' decisions are combined. Experimental results on several data sets from the UCI repository show that the proposed method yields better classification rates than the basic ME and static combining of neural networks based on D-S theory.
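For readers comparing the two fusion schemes the abstract contrasts, the following minimal Python sketch (illustrative only, not the authors' implementation) shows (a) the basic ME combination, where a gating network's softmax weights blend the experts' class posteriors, and (b) Dempster's rule of combination under the simplifying assumption that each expert's posterior is treated as a Bayesian mass function (all mass on singleton classes), in which case the rule reduces to a conflict-normalized element-wise product. All function names and the toy numbers are hypothetical.

import numpy as np

def softmax(z):
    # Numerically stable softmax.
    e = np.exp(z - z.max())
    return e / e.sum()

def mixture_of_experts(expert_posteriors, gate_logits):
    # Basic ME: gate-weighted sum of expert class posteriors.
    # expert_posteriors: (K, C) array, row k = expert k's posterior.
    # gate_logits: (K,) gating-network scores for the K experts.
    g = softmax(gate_logits)          # gating weights, sum to 1
    return g @ expert_posteriors      # (C,) combined posterior

def dempster_combine(m1, m2, eps=1e-12):
    # Dempster's rule for Bayesian mass functions (mass only on
    # singleton classes): a normalized element-wise product, where
    # the normalizer is the total non-conflicting mass.
    joint = m1 * m2
    k = joint.sum()                   # non-conflicting mass
    if k < eps:                       # total conflict: rule undefined
        raise ValueError("experts are in total conflict")
    return joint / k

# Toy usage: K = 3 experts over C = 2 classes (made-up numbers).
experts = np.array([[0.7, 0.3],
                    [0.6, 0.4],
                    [0.2, 0.8]])
print(mixture_of_experts(experts, np.array([1.0, 0.5, -0.5])))

fused = experts[0]
for m in experts[1:]:
    fused = dempster_combine(fused, m)
print(fused)                          # D-S fusion of all three experts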
Keywords
expert systems; inference mechanisms; learning (artificial intelligence); multilayer perceptrons; neural nets; Dempster-Shafer theory; MLP-experts; UCI repository; evidence-based mixture; gating network; mixture of experts; modular neural network architecture; supervised learning; Face; Facsimile; Glass; Sonar; Vehicles
fLanguage
English
Publisher
ieee
Conference_Title
The 2010 International Joint Conference on Neural Networks (IJCNN)
Conference_Location
Barcelona, Spain
ISSN
1098-7576
Print_ISBN
978-1-4244-6916-1
Type
conf
DOI
10.1109/IJCNN.2010.5596928
Filename
5596928
Link To Document