DocumentCode
802005
Title
Variational Bayes for generalized autoregressive models
Author
Roberts, Stephen J.; Penny, Will D.
Author_Institution
Robotics Res. Group, Oxford Univ., UK
Volume
50
Issue
9
fYear
2002
fDate
9/1/2002
Firstpage
2245
Lastpage
2257
Abstract
We describe a variational Bayes (VB) learning algorithm for generalized autoregressive (GAR) models. The noise is modeled as a mixture of Gaussians rather than the usual single Gaussian. This allows different data points to be associated with different noise levels and effectively provides robust estimation of AR coefficients. The VB framework is used to prevent overfitting and provides model-order selection criteria for both the AR order and the noise model order. We show that for the special case of Gaussian noise and uninformative priors on the noise and weight precisions, the VB framework reduces to the Bayesian evidence framework. The algorithm is applied to synthetic and real data with encouraging results.
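The GAR model summarized in the abstract is an AR process whose innovations are drawn from a mixture of Gaussians rather than a single Gaussian. The sketch below is illustrative only and not taken from the paper: the AR(2) coefficients, mixture weights, and component variances are arbitrary example values, and the ordinary least-squares fit is included solely to show how occasional high-variance innovations bias a single-Gaussian (least-squares) estimate, which is the situation the paper's VB mixture-noise model is designed to handle robustly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical AR(2) coefficients chosen to give a stable process (not from the paper).
a = np.array([0.6, -0.3])
p = len(a)

# Two-component Gaussian mixture for the innovations: a common low-variance component
# and a rare high-variance "outlier" component (example values, not from the paper).
mix_weights = np.array([0.9, 0.1])
mix_stds = np.array([0.1, 1.0])

T = 500
x = np.zeros(T)
for t in range(p, T):
    # Choose a mixture component for this time step, then draw the innovation from it.
    k = rng.choice(2, p=mix_weights)
    e_t = rng.normal(0.0, mix_stds[k])
    # AR prediction: weighted sum of the previous p samples, plus the innovation.
    x[t] = a @ x[t - p:t][::-1] + e_t

# Ordinary least-squares AR fit for comparison: the occasional large innovations from
# the second mixture component inflate its errors, illustrating why a robust
# mixture-of-Gaussians noise model is useful for estimating the AR coefficients.
X = np.column_stack([x[p - 1 - i:T - 1 - i] for i in range(p)])
y = x[p:]
a_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
print("true AR coefficients:  ", a)
print("least-squares estimate:", a_ols)
```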
Keywords
Bayes methods; Gaussian noise; autoregressive processes; electroencephalography; learning (artificial intelligence); medical signal processing; signal sampling; variational techniques; AR coefficients; AR order; Bayesian evidence; EEG data; generalized autoregressive models; mixture of Gaussians; model-order selection criteria; noise model order; noise precision; real data; robust estimation; sampling; synthetic data; uninformative priors; variational Bayes learning algorithm; weight precision; Bayesian methods; Cost function; Inference algorithms; Least squares methods; Noise level; Noise reduction; Noise robustness
fLanguage
English
Journal_Title
IEEE Transactions on Signal Processing
Publisher
IEEE
ISSN
1053-587X
Type
Journal Article
DOI
10.1109/TSP.2002.801921
Filename
1025587