Author_Institution :
Brain Sci. Inst., Amari Res. Unit, RIKEN, Wako
Abstract :
In this paper, it is shown how (naive and structured) variational algorithms may be derived from a factor graph by mechanically applying generic message computation rules; in this way, one can bypass error-prone variational calculus. In prior work by Bishop et al., Xing et al., and Geiger, directed and undirected graphical models have been used for this purpose. The factor graph notation leads to simpler generic variational message computation rules; by means of factor graphs, variational methods can straightforwardly be compared to and combined with various other message-passing inference algorithms, e.g., Kalman filters and smoothers, iterated conditional modes, expectation maximization (EM), gradient methods, and particle filters. Some of those combinations have been explored in the literature; others appear to be new. Generic message computation rules for such combinations are formulated.
Keywords :
expectation maximization (EM) algorithm; graph theory; inference mechanisms; information theory; message passing; particle filtering (numerical methods); variational techniques; Kalman filters; variational calculus; factor graphs; generic variational message computation rules; gradient methods; iterated conditional modes; message-passing inference algorithms; variational message passing; graphical models; random variables; state estimation; state-space methods;
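As an illustration of the mechanical derivation the abstract describes, the following is a minimal sketch, not taken from the paper: naive mean-field (fully factorized) coordinate updates for a toy bivariate Gaussian target. The model (mean `mu`, precision matrix `L`) is a hypothetical example chosen so the updates have a closed form; the paper's message computation rules apply to general factor graphs.

```python
# Toy example (assumed, not from the paper): naive mean-field variational
# updates for a bivariate Gaussian p(x1, x2) with mean mu and precision L.
# With q(x1, x2) = q1(x1) q2(x2), the optimal Gaussian factors have means
# updated by coordinate ascent; for a Gaussian target these converge to mu.

mu = (1.0, -2.0)             # true mean of the target distribution
L = ((2.0, 0.8),             # precision matrix (symmetric, positive definite)
     (0.8, 1.5))

m1, m2 = 0.0, 0.0            # current means of the factors q1, q2
for _ in range(50):          # alternate the two "message" updates
    m1 = mu[0] - (L[0][1] / L[0][0]) * (m2 - mu[1])
    m2 = mu[1] - (L[1][0] / L[1][1]) * (m1 - mu[0])

print(m1, m2)                # approaches the true mean (1.0, -2.0)
```

Each update depends only on the current estimate of the other variable, which is exactly the local, message-passing character that lets such schemes be read off a factor graph instead of being re-derived by variational calculus.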