DocumentCode :
2746710
Title :
New boosting methods of Gaussian processes for regression
Author :
Song, Yangqiu ; Zhang, Changshui
Author_Institution :
Dept. of Autom., Tsinghua Univ., Beijing, China
Volume :
2
fYear :
2005
fDate :
31 July-4 Aug. 2005
Firstpage :
1142
Abstract :
Feedforward neural networks are popular tools for nonlinear regression and classification problems. A Gaussian Process (GP) can be viewed as an RBF neural network with an infinite number of hidden neurons. On regression problems, a GP predicts both the mean value and the variance for a given sample. Boosting is one of the most important recent developments in machine learning. Classification problems have dominated boosting research to date, whereas the application of boosting to regression has received less attention. In this paper, we develop two boosting methods of GPs for regression that exploit these characteristics of GPs. We compare the performance of our ensembles with that of other boosting algorithms and find that our methods are more stable and suffer substantially less from over-fitting than the other methods.
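The abstract's point that GP regression yields both a predictive mean and a predictive variance can be illustrated with a minimal sketch. The snippet below is not taken from the paper; it assumes scikit-learn's GaussianProcessRegressor with an RBF kernel and uses synthetic data purely for illustration.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic 1-D regression data (illustrative only, not from the paper).
rng = np.random.RandomState(0)
X_train = rng.uniform(-3, 3, size=(30, 1))
y_train = np.sin(X_train).ravel() + 0.1 * rng.randn(30)

# The RBF kernel corresponds to the infinite-width RBF-network view of a GP.
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, random_state=0)
gp.fit(X_train, y_train)

# Predict both the mean and the standard deviation at test points;
# the predictive variance is the squared standard deviation.
X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
mean, std = gp.predict(X_test, return_std=True)
print(mean, std ** 2)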
Keywords :
Gaussian processes; learning (artificial intelligence); pattern classification; radial basis function networks; regression analysis; Gaussian processes; RBF neural network; boosting methods; classification problems; feedforward neural networks; hidden neurons; machine learning; nonlinear regression; Automation; Boosting; Feedforward neural networks; Feeds; Gaussian processes; Intelligent systems; Laboratories; Machine learning; Neural networks; Neurons;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 2005. IJCNN '05. Proceedings. 2005 IEEE International Joint Conference on
Print_ISBN :
0-7803-9048-2
Type :
conf
DOI :
10.1109/IJCNN.2005.1556014
Filename :
1556014