DocumentCode :
1527534
Title :
Marginalized Neural Network Mixtures for Large-Scale Regression
Author :
Lázaro-Gredilla, Miguel ; Figueiras-Vidal, Aníbal R.
Author_Institution :
Dept. of Signal Process. & Commun., Univ. Carlos III de Madrid, Leganés, Spain
Volume :
21
Issue :
8
fYear :
2010
Firstpage :
1345
Lastpage :
1351
Abstract :
For regression tasks, traditional neural networks (NNs) have been superseded by Gaussian processes, which provide probabilistic predictions (input-dependent error bars), improved accuracy, and virtually no overfitting. However, their high computational cost means that, for massive data sets, one must resort to sparse Gaussian processes, which strive to achieve similar performance at much smaller computational effort. In this context, we introduce a mixture of NNs with marginalized output weights that can both provide probabilistic predictions and improve on the performance of sparse Gaussian processes, at the same computational cost. The effectiveness of this approach is shown experimentally on several representative large data sets.
Keywords :
Gaussian processes; neural nets; regression analysis; large-scale regression; marginalized neural network mixtures; marginalized output weights; probabilistic predictions; sparse Gaussian processes; Bars; Computational efficiency; Costs; Gaussian processes; High performance computing; Large-scale systems; Multilayer perceptrons; Neural networks; Testing; Uncertainty; Bayesian models; gaussian processes; large data sets; multilayer perceptrons; regression; Animals; Computer Simulation; Data Interpretation, Statistical; Data Mining; Humans; Models, Statistical; Neural Networks (Computer); Normal Distribution; Regression Analysis;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2010.2049859
Filename :
5499041