DocumentCode :
3412661
Title :
Cost-sensitive boosting algorithms as gradient descent
Author :
Cai, Qu-Tang ; Song, Yang-Qiu ; Zhang, Chang-Shui
Author_Institution :
Dept. of Autom., Tsinghua Univ., Beijing
fYear :
2008
fDate :
March 31 - April 4, 2008
Firstpage :
2009
Lastpage :
2012
Abstract :
AdaBoost is a well-known boosting method for generating a strong ensemble from weak base learners. The AdaBoost procedure can be fit into a gradient descent optimization framework, which is important for analyzing and extending it. Cost-sensitive boosting (CSB) is an emerging subject that extends boosting methods to cost-sensitive classification applications. Most CSB methods are obtained by directly modifying the original AdaBoost procedure, and unfortunately their effectiveness has been verified only by experiments. It remains unclear whether these methods can be viewed as gradient descent procedures in the way AdaBoost can. In this paper, we show that several typical CSB methods can also be viewed as gradient descent for minimizing a unified objective function, and we then derive a general greedy boosting procedure. Experimental results validate the effectiveness of the proposed procedure.
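Illustrative Sketch :
A minimal sketch, not the paper's exact procedure, of how a cost-sensitive variant of AdaBoost can be written as a greedy, gradient-descent-like loop on a weighted exponential loss. The function name, the choice of decision stumps as base learners, and the way per-example costs enter through the initial weight distribution are illustrative assumptions, not taken from the paper.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def cost_sensitive_boost(X, y, costs, n_rounds=50):
    """Greedy boosting with per-example misclassification costs.

    y takes values in {-1, +1}; costs[i] > 0 scales example i's influence.
    Each round fits a stump to the current cost-weighted distribution and
    takes a greedy step (the gradient-descent view of boosting) on the
    weighted exponential loss.
    """
    w = costs / costs.sum()          # assumption: costs seed the initial distribution
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = w[pred != y].sum()     # weighted training error of this round's learner
        if err <= 0 or err >= 0.5:   # stop if the stump is perfect or no better than chance
            break
        alpha = 0.5 * np.log((1 - err) / err)   # greedy (line-search) step size
        w = w * np.exp(-alpha * y * pred)       # multiplicative update = descent step
        w = w / w.sum()
        learners.append(stump)
        alphas.append(alpha)

    def predict(X_new):
        scores = sum(a * h.predict(X_new) for a, h in zip(alphas, learners))
        return np.sign(scores)

    return predict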
Keywords :
gradient methods; pattern recognition; cost-sensitive boosting algorithms; gradient descent optimization; Algorithm design and analysis; Automation; Boosting; Classification algorithms; Costs; Information science; Intelligent systems; Laboratories; Pattern recognition; Boosting; Cost-sensitive Classification; Gradient Descent; Optimization;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2008 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2008)
Conference_Location :
Las Vegas, NV
ISSN :
1520-6149
Print_ISBN :
978-1-4244-1483-3
Electronic_ISBN :
1520-6149
Type :
conf
DOI :
10.1109/ICASSP.2008.4518033
Filename :
4518033