DocumentCode :
2671459
Title :
Neural network regression with input uncertainty
Author :
Wright, W.A.
Author_Institution :
Sowerby Res. Centre, British Aerosp., Bristol, UK
fYear :
1998
fDate :
31 Aug-2 Sep 1998
Firstpage :
284
Lastpage :
293
Abstract :
It is generally assumed when using Bayesian inference methods for neural networks that the input data contain no noise or corruption. For real-world (errors-in-variables) problems this is clearly an unsafe assumption. This paper presents a Bayesian neural network framework that allows for input noise, given that some model of the noise process exists. In the limit where this noise is small and symmetric, it is shown using the Laplace approximation that an additional term appears in the usual Bayesian error bar, depending on the variance of the input noise process. Further, by treating the true (noiseless) input as a hidden variable and sampling it jointly with the network's weights using Markov chain Monte Carlo methods, it is demonstrated that the unbiased regression over the noiseless input can be inferred.
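Illustrative note (not part of the original record): for the Laplace-approximation result, one plausible form, consistent with first-order propagation of the input noise and not reproduced from the paper, adds a term of order (df/dx)^T * Sigma_x * (df/dx) to the usual Bayesian predictive variance sigma_nu^2 + g^T A^{-1} g, where g = df/dw and A is the Hessian of the regularized error. The sketch below illustrates the second idea in the abstract, treating the noise-free input as a hidden variable sampled jointly with the network weights by Markov chain Monte Carlo. All modelling choices (one-hidden-layer tanh network, Gaussian weight prior, known Gaussian input- and output-noise variances, random-walk Metropolis proposals) are assumptions made for this sketch, not the paper's implementation.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic errors-in-variables data: the targets depend on the true inputs,
# but only a noise-corrupted version of the inputs is observed.
n, hidden = 40, 8
x_true = np.linspace(-2.0, 2.0, n)
sigma_x, sigma_y = 0.3, 0.1                      # assumed input / output noise std
x_obs = x_true + sigma_x * rng.normal(size=n)
y_obs = np.sin(x_true) + sigma_y * rng.normal(size=n)

def unpack(w):
    # Split the flat weight vector into layer parameters.
    w1, b1 = w[:hidden], w[hidden:2 * hidden]
    w2, b2 = w[2 * hidden:3 * hidden], w[3 * hidden]
    return w1, b1, w2, b2

def net(x, w):
    # One-hidden-layer tanh network with a linear output.
    w1, b1, w2, b2 = unpack(w)
    return np.tanh(np.outer(x, w1) + b1) @ w2 + b2

def log_post(w, x_lat):
    # Log joint posterior over weights and latent (noise-free) inputs.
    lik_y = -0.5 * np.sum((y_obs - net(x_lat, w)) ** 2) / sigma_y ** 2
    lik_x = -0.5 * np.sum((x_obs - x_lat) ** 2) / sigma_x ** 2   # input-noise model
    prior_w = -0.5 * np.sum(w ** 2)                              # unit Gaussian weight prior
    return lik_y + lik_x + prior_w

n_w = 3 * hidden + 1
w, x_lat = 0.1 * rng.normal(size=n_w), x_obs.copy()
lp = log_post(w, x_lat)
samples = []

for step in range(20000):
    # Random-walk Metropolis proposal over the joint state (weights, latent inputs).
    w_prop = w + 0.02 * rng.normal(size=n_w)
    x_prop = x_lat + 0.02 * rng.normal(size=n)
    lp_prop = log_post(w_prop, x_prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        w, x_lat, lp = w_prop, x_prop, lp_prop
    if step > 10000 and step % 50 == 0:
        samples.append(net(np.linspace(-2, 2, 50), w))

# Posterior-mean regression curve, i.e. the regression inferred over the noiseless input.
mean_curve = np.mean(samples, axis=0)
print(mean_curve[:5])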
Keywords :
Bayes methods; Markov processes; Monte Carlo methods; inference mechanisms; neural nets; noise; statistical analysis; Bayesian inference methods; Bayesian neural network; Laplace approximation; Markov chain Monte-Carlo methods; data corruption; input noise; input uncertainty; neural network regression; small symmetric noise process; unbiased regression inference; Additive noise; Bayesian methods; Flexible printed circuits; Gaussian noise; Learning systems; Neural networks; Sampling methods; Sensor phenomena and characterization; Sensor systems; Uncertainty;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
Neural Networks for Signal Processing VIII, 1998. Proceedings of the 1998 IEEE Signal Processing Society Workshop
Conference_Location :
Cambridge
ISSN :
1089-3555
Print_ISBN :
0-7803-5060-X
Type :
conf
DOI :
10.1109/NNSP.1998.710658
Filename :
710658