DocumentCode :
1685882
Title :
Weighted combination of neural network ensembles
Author :
Wanas, Nayer M. ; Kamel, Mohamed S.
Author_Institution :
Dept. of Syst. Design Eng., Waterloo Univ., Ont., Canada
Volume :
2
fYear :
2002
fDate :
2002
Firstpage :
1748
Lastpage :
1752
Abstract :
Numerous schemes exist to determine the output of an ensemble of classifiers, the most common being the majority vote. An improvement can be expected if the members of the ensemble are weighted according to their individual performance. The feature-based approach provides an architecture that works toward this goal. Moreover, if the final classification can in turn influence these weights, a further increase in overall classification performance should be expected. In this paper we present a new training algorithm that uses a feedback mechanism to iteratively improve the classification capability of the feature-based approach. The approach is compared with the standard training method as well as with standard aggregation schemes for combining classifier ensembles. Empirical results show that the proposed architecture improves classification accuracy.
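The abstract contrasts simple majority voting with a performance-weighted combination whose weights are refined through feedback from the ensemble's own decisions. The sketch below is only an illustration of that generic idea; it does not reproduce the paper's feature-based architecture or its actual training rule, and all function names and the multiplicative update are assumptions made for demonstration.

```python
# Illustrative sketch, not the paper's algorithm: weighted combination of
# ensemble member outputs, with weights refined by a simple feedback loop,
# compared against a plain majority vote.
import numpy as np

def majority_vote(member_preds):
    """member_preds: (n_members, n_samples) integer class labels."""
    n_classes = member_preds.max() + 1
    # Count votes per class for each sample, then pick the most voted class.
    votes = np.apply_along_axis(
        lambda col: np.bincount(col, minlength=n_classes), 0, member_preds)
    return votes.argmax(axis=0)

def weighted_combination(member_probs, weights):
    """member_probs: (n_members, n_samples, n_classes) class scores."""
    combined = np.tensordot(weights, member_probs, axes=1)  # (n_samples, n_classes)
    return combined.argmax(axis=1)

def refine_weights(member_probs, y_true, n_iters=10, lr=0.5):
    """Assumed feedback rule: reward members that are correct on samples
    the current weighted ensemble still gets wrong, then renormalize."""
    n_members = member_probs.shape[0]
    weights = np.full(n_members, 1.0 / n_members)
    for _ in range(n_iters):
        combined_pred = weighted_combination(member_probs, weights)
        for m in range(n_members):
            member_pred = member_probs[m].argmax(axis=1)
            gain = np.mean((member_pred == y_true) & (combined_pred != y_true))
            weights[m] *= np.exp(lr * gain)
        weights /= weights.sum()
    return weights
```

In this toy setup the weights would typically be fit on held-out validation data and then fixed for test-time combination; the paper's contribution is a training procedure in which such feedback is built into learning the combination itself.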
Keywords :
learning (artificial intelligence); neural nets; aggregation schemes; feature based approach; feedback mechanism; majority vote; neural network ensembles; training algorithm; weighted combination; Design engineering; Detectors; Iterative algorithms; Jacobian matrices; Machine intelligence; Neural networks; Pattern analysis; System analysis and design; Systems engineering and theory; Voting;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN '02)
Conference_Location :
Honolulu, HI
ISSN :
1098-7576
Print_ISBN :
0-7803-7278-6
Type :
conf
DOI :
10.1109/IJCNN.2002.1007782
Filename :
1007782