DocumentCode
352964
Title
Bagging down-weights leverage points
Author
Grandvalet, Yves
Author_Institution
Univ. de Technol. de Compiegne, France
Volume
4
fYear
2000
fDate
2000
Firstpage
505
Abstract
Bagging is a procedure averaging estimators trained on bootstrap samples. Numerous experiments have shown that bagged estimates often yield better results than the original predictor, and several explanations have been given to account for this gain. However, six years after its introduction, bagging is still not fully understood. Most explanations given so far are based on global properties of the estimates. Here, we focus on the local effects at leverage points, i.e., observations whose fitted values are largely determined by the corresponding response values. These points are shown experimentally to be down-weighted by bagging, so that the performance of the bagged estimate depends on whether such points are beneficial or detrimental to the original estimator. These findings are illustrated on examples, supported by a study of the smoothing matrix, and their consequences are discussed.
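The down-weighting effect described in the abstract can be checked numerically on a toy linear smoother. The sketch below is not from the paper; the quadratic design, the isolated point at x = 5, and all variable names are illustrative assumptions. It compares the hat-matrix diagonal (leverage) of an ordinary least-squares fit with the diagonal of the bagged estimator's smoothing matrix, obtained by averaging the per-resample smoothing matrices over bootstrap samples.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy design: 29 points in [-1, 1] plus one isolated point at x = 5,
    # which acts as a leverage point for a quadratic least-squares fit.
    x = np.concatenate([rng.uniform(-1.0, 1.0, 29), [5.0]])
    X = np.column_stack([np.ones_like(x), x, x**2])   # intercept, linear, quadratic terms
    n = len(x)

    # Smoothing ("hat") matrix of the original fit: y_hat = H @ y.
    H = X @ np.linalg.solve(X.T @ X, X.T)

    # Smoothing matrix of the bagged fit, averaged over bootstrap resamples.
    B = 2000
    S_bag = np.zeros((n, n))
    for _ in range(B):
        idx = rng.integers(0, n, size=n)              # bootstrap sample of row indices
        Xb = X[idx]
        # Predictions at all original points are linear in the resampled responses:
        # y_hat_b = X @ (Xb' Xb)^{-1} Xb' @ y[idx]; a tiny ridge guards against
        # the (rare) singular resample.
        Mb = X @ np.linalg.solve(Xb.T @ Xb + 1e-8 * np.eye(X.shape[1]), Xb.T)
        Sb = np.zeros((n, n))
        np.add.at(Sb.T, idx, Mb.T)                    # scatter columns back onto original observations
        S_bag += Sb
    S_bag /= B

    print("leverage of isolated point, original fit:", H[-1, -1])
    print("leverage of isolated point, bagged fit  :", S_bag[-1, -1])

Under these assumptions, the original leverage of the isolated point is close to 1, while the bagged smoothing weight is noticeably smaller, consistent with the paper's experimental finding that bagging down-weights leverage points.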
Keywords
prediction theory; statistical analysis; bagged estimates; bootstrap samples; down-weighted; leverage points; predictor; Bagging; Boosting; Classification tree analysis; Prediction methods; Smoothing methods; Testing; Yield estimation;
fLanguage
English
Publisher
ieee
Conference_Titel
Neural Networks, 2000. IJCNN 2000, Proceedings of the IEEE-INNS-ENNS International Joint Conference on
Conference_Location
Como
ISSN
1098-7576
Print_ISBN
0-7695-0619-4
Type
conf
DOI
10.1109/IJCNN.2000.860821
Filename
860821