Title :
Mixing Aveboost and Conserboost to Improve Boosting Methods
Author :
Torres-Sospedra, Joaquín ; Hernández-Espinosa, Carlos ; Fernández-Redondo, Mercedes
Author_Institution :
Univ. Jaume I, Castellon
Abstract :
Adaptive boosting (Adaboost) is one of the best-known methods for building an ensemble of neural networks. Adaboost has been studied and successfully improved by authors such as Breiman, Kuncheva and Oza. In this paper we briefly analyze and mix two of the most important variants of Adaboost in order to build a more robust ensemble of neural networks. The boosting methods we have studied are averaged boosting (Aveboost) and conservative boosting (Conserboost). We propose a mixed method called averaged conservative boosting (ACB). In this method we apply the conservative equation used in Conserboost along with the averaging procedure used in Aveboost in order to update the sampling distribution of Adaboost. We have tested the methods on seven databases from the UCI repository. We have used the mean increase of performance and the mean percentage of error reduction to compare the methods; the results show that the new proposed method performs better.
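The abstract only outlines the distribution update, so the following is a minimal NumPy sketch of one plausible reading, not the paper's exact equations. It assumes the conservative step raises the sampling weight of misclassified patterns only (in the spirit of Kuncheva's conservative boosting) and that the averaged step blends the new weights with the running distribution as in Oza's Aveboost; the function name acb_update and the alpha formula are illustrative assumptions.

    import numpy as np

    def acb_update(dist, errors, t, alpha):
        """One hypothetical sampling-distribution update for averaged
        conservative boosting (ACB), as sketched in the abstract.

        dist   -- current sampling distribution D_t over the N training patterns
        errors -- boolean array, True where network t misclassified pattern i
        t      -- 1-based index of the network just trained
        alpha  -- weighting factor, e.g. 0.5 * log((1 - eps) / eps)
        """
        # Conservative step (Conserboost-style, assumed): raise the weight of
        # misclassified patterns only; correctly classified patterns keep
        # their current weight.
        c = dist * np.where(errors, np.exp(alpha), 1.0)
        c /= c.sum()  # renormalize to a probability distribution
        # Averaged step (Aveboost-style): mix the new weights with the running
        # distribution so no single network dominates the sampling.
        d_next = (t * dist + c) / (t + 1)
        return d_next / d_next.sum()

A toy usage, with a simulated error vector standing in for a trained network's mistakes:

    rng = np.random.default_rng(0)
    n = 100
    dist = np.full(n, 1.0 / n)       # D_1: uniform over the training set
    errors = rng.random(n) < 0.2     # pretend network 1 misclassified 20%
    eps = dist[errors].sum()         # weighted training error of network 1
    alpha = 0.5 * np.log((1 - eps) / eps)
    dist = acb_update(dist, errors, t=1, alpha=alpha)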
Keywords :
learning (artificial intelligence); sampling methods; statistical distributions; Adaboost sampling distribution; Aveboost method; Conserboost method; adaptive boosting; averaged conservative boosting; conservative boosting; conservative equation; neural network ensemble; Artificial neural networks; Boosting; Computer networks; Databases; Equations; Neural networks; Nonhomogeneous media; Robustness; Sampling methods; Testing;
Conference_Title :
2007 International Joint Conference on Neural Networks (IJCNN 2007)
Conference_Location :
Orlando, FL
Print_ISBN :
978-1-4244-1379-9
Electronic_ISSN :
1098-7576
DOI :
10.1109/IJCNN.2007.4371037