DocumentCode
876401
Title
Bayesian support vector regression using a unified loss function
Author
Chu, Wei ; Keerthi, S. Sathiya ; Ong, Chong Jin
Author_Institution
Gatsby Comput. Neurosci. Unit, Univ. Coll. London, UK
Volume
15
Issue
1
fYear
2004
Firstpage
29
Lastpage
44
Abstract
In this paper, we use a unified loss function, called the soft insensitive loss function, for Bayesian support vector regression. We follow standard Gaussian processes for regression to set up the Bayesian framework, in which the unified loss function is used in the likelihood evaluation. Under this framework, the maximum a posteriori estimate of the function values corresponds to the solution of an extended support vector regression problem. The overall approach has the merits of support vector regression, such as convex quadratic programming and sparsity in the solution representation, as well as the advantages of Bayesian methods, such as model adaptation and error bars on its predictions. Experimental results on simulated and real-world data sets indicate that the approach works well even on large data sets.
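A minimal sketch of the soft insensitive loss function mentioned in the abstract, assuming its usual piecewise form: a flat insensitive zone, a quadratic blend of width 2*beta*epsilon at its edges, and linear tails beyond. This is an illustrative reconstruction, not the authors' code, and the parameter names are chosen here for clarity.

```python
# Sketch (illustrative, not the authors' implementation) of the soft
# insensitive loss: zero inside (1-beta)*eps, quadratic between
# (1-beta)*eps and (1+beta)*eps, and linear (|r| - eps) beyond that,
# so the loss is continuously differentiable everywhere.
import numpy as np

def soft_insensitive_loss(residual, epsilon=0.1, beta=0.3):
    r = np.abs(np.asarray(residual, dtype=float))
    inner = (1.0 - beta) * epsilon   # edge of the flat insensitive zone
    outer = (1.0 + beta) * epsilon   # edge of the quadratic smoothing band
    loss = np.zeros_like(r)
    mid = (r > inner) & (r <= outer)
    loss[mid] = (r[mid] - inner) ** 2 / (4.0 * beta * epsilon)
    tail = r > outer
    loss[tail] = r[tail] - epsilon
    return loss

# beta -> 0 recovers the epsilon-insensitive loss of standard SVR;
# beta = 1 gives a Huber-style loss with no flat zone.
if __name__ == "__main__":
    deltas = np.linspace(-0.4, 0.4, 9)
    print(np.round(soft_insensitive_loss(deltas, epsilon=0.1, beta=0.3), 4))
```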
Keywords
Bayes methods; Gaussian processes; regression analysis; support vector machines; Bayesian framework; Bayesian support vector regression; error bars; likelihood evaluation; maximum a posteriori estimation; model adaptation; nonquadratic loss function; soft insensitive loss function; standard Gaussian processes; unified loss function; Bayesian methods; Biological neural networks; Ground penetrating radar; Kernel; Predictive models; Quadratic programming; Support vector machine classification; Bayes Theorem
fLanguage
English
Journal_Title
IEEE Transactions on Neural Networks
Publisher
IEEE
ISSN
1045-9227
Type
jour
DOI
10.1109/TNN.2003.820830
Filename
1263576