DocumentCode :
1551405
Title :
Moderating the outputs of support vector machine classifiers
Author :
Kwok, James Tin-Yau
Author_Institution :
Dept. of Comput. Sci., Hong Kong Baptist Univ., Kowloon Tong, Hong Kong
Volume :
10
Issue :
5
fYear :
1999
fDate :
9/1/1999
Firstpage :
1018
Lastpage :
1031
Abstract :
In this paper, we extend the use of moderated outputs to the support vector machine (SVM) by exploiting a relationship between the SVM and the evidence framework. The moderated output is more in line with the Bayesian idea that the posterior weight distribution should be taken into account when making predictions, and it also alleviates the usual tendency to assign overly high confidence to the estimated class memberships of the test patterns. Moreover, the moderated output derived here can be taken as an approximation to the posterior class probability. Hence, meaningful rejection thresholds can be assigned and outputs from several networks can be directly compared. Experimental results on both artificial and real-world data are also discussed.
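The moderation step itself is not spelled out in this record; as a rough illustration only, the Python sketch below (with made-up names) shows the standard MacKay-style moderated output, in which a raw decision value is squashed by a logistic sigmoid whose slope is damped according to the predictive variance of that value. The paper obtains such a variance for the SVM via the evidence framework, which is not reproduced here; the variance is simply assumed to be given.

    import numpy as np

    def moderated_output(raw_output, output_variance):
        """MacKay-style moderated probability for a binary classifier.

        raw_output      : mean of the pre-sigmoid decision value, e.g. an SVM
                          output f(x); illustrative stand-in, not the paper's
                          exact quantity.
        output_variance : predictive variance of that value (assumed given here;
                          the paper derives it from the evidence framework).
        """
        # Damp the activation by the variance before applying the sigmoid,
        # so uncertain predictions are pulled towards probability 0.5.
        kappa = 1.0 / np.sqrt(1.0 + np.pi * output_variance / 8.0)
        return 1.0 / (1.0 + np.exp(-kappa * raw_output))

    # Example: the same raw output with large variance is moderated towards 0.5.
    print(moderated_output(2.0, 0.0))   # ~0.88, no uncertainty
    print(moderated_output(2.0, 25.0))  # ~0.65, high uncertainty -> softer probability

Because the moderated value approximates a posterior class probability, a rejection rule such as "abstain when the probability lies between 0.3 and 0.7" becomes meaningful, which is the use case the abstract alludes to.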
Keywords :
Bayes methods; case-based reasoning; feedforward neural nets; learning (artificial intelligence); pattern classification; Bayesian idea; SVM; estimated class memberships; evidence framework; meaningful rejection thresholds; output moderation; posterior class probability; posterior weight distribution; support vector machine classifiers; Bayesian methods; Learning systems; Machine learning; Neural networks; Risk management; Support vector machine classification; Support vector machines; Testing; Uncertainty; Upper bound;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.788642
Filename :
788642