DocumentCode :
1749207
Title :
Neural learning using AdaBoost
Author :
Murphey, Yi L. ; Chen, Zhihang ; Guo, Hong
Author_Institution :
Dept. of Electr. & Comput. Eng., Michigan Univ., Dearborn, MI, USA
Volume :
2
fYear :
2001
fDate :
2001
Firstpage :
1037
Abstract :
This paper describes a committee of neural networks submitted to the IJCNN 2001 generalization ability challenge (GAC) competition and a number of implementation issues, with a focus on the generalization problem. The committee of neural networks was generated using the well-known AdaBoost algorithm, a general method for improving the performance of any learning algorithm that consistently generates classifiers performing better than random guessing. We also discuss feature selection and the various experiments conducted in the hope of finding a neural network architecture that generalizes correctly on the blind test data used in the GAC competition.
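The abstract summarizes AdaBoost's core loop: repeatedly train a weak learner on reweighted data, then combine the learners with weights derived from their errors. The paper's weak learners are neural networks; as a minimal illustrative sketch (not the authors' implementation), the following uses 1-D decision stumps as the weak learner, with hypothetical helper names (`train_stump`, `adaboost`, `predict`):

```python
import math

def train_stump(X, y, w):
    # Weak learner: pick the threshold/polarity with lowest weighted error.
    # X: list of floats, y: labels in {-1, +1}, w: weights summing to 1.
    best = None
    for thresh in sorted(set(X)):
        for polarity in (1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if polarity * (1 if xi >= thresh else -1) != yi)
            if best is None or err < best[0]:
                best = (err, thresh, polarity)
    return best  # (weighted error, threshold, polarity)

def adaboost(X, y, rounds=10):
    n = len(X)
    w = [1.0 / n] * n           # start with uniform example weights
    ensemble = []               # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        err, thresh, pol = train_stump(X, y, w)
        err = max(err, 1e-10)   # avoid log(0) on a perfect stump
        if err >= 0.5:          # weak learner is no better than chance
            break
        alpha = 0.5 * math.log((1 - err) / err)  # learner's vote weight
        ensemble.append((alpha, thresh, pol))
        # Reweight: misclassified examples gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * yi * pol * (1 if xi >= thresh else -1))
             for xi, yi, wi in zip(X, y, w)]
        z = sum(w)
        w = [wi / z for wi in w]  # renormalize to a distribution
    return ensemble

def predict(ensemble, x):
    # Weighted majority vote of all weak learners in the committee.
    score = sum(a * p * (1 if x >= t else -1) for a, t, p in ensemble)
    return 1 if score >= 0 else -1
```

In the paper's setting, `train_stump` would be replaced by training a backpropagation network on the reweighted distribution, and `predict` would combine the committee's network outputs the same way.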
Keywords :
adaptive systems; backpropagation; feature extraction; generalisation (artificial intelligence); neural nets; AdaBoost; IJCNN 2001 competition; adaptive boosting; backpropagation; feature selection; generalization ability challenge; learning algorithm; neural networks; Backpropagation algorithms; Boosting; Decision trees; Distribution functions; Information processing; Machine learning algorithms; Neural networks; Neurons; System testing; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-7044-9
Type :
conf
DOI :
10.1109/IJCNN.2001.939503
Filename :
939503