DocumentCode :
2407024
Title :
Steepest descent as message passing
Author :
Dauwels, Justin ; Korl, Sascha ; Loeliger, Hans-Andrea
Author_Institution :
Dept. of Inf. Technol. & Electr. Eng., ETH, Zurich, Switzerland
fYear :
2005
fDate :
29 Aug.-1 Sept. 2005
Abstract :
It is shown how steepest descent (or steepest ascent) may be viewed as a message passing algorithm with "local" message update rules. For example, the well-known backpropagation algorithm for the training of feedforward neural networks may be viewed as message passing on a factor graph. The factor graph approach with its emphasis on "local" computations makes it easy to combine steepest descent with other message passing algorithms such as the sum/max-product algorithms, expectation maximization, Kalman filtering/smoothing, and particle filters. As an example, parameter estimation in a state space model is considered. For this example, it is shown how steepest descent can be used for the maximization step in expectation maximization.
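The abstract's core idea, a steepest-ascent update applied to a (log-likelihood-style) objective, can be sketched as follows. This is an illustrative toy example, not code from the paper; the function names, the unit-variance Gaussian objective, and the step size are all assumptions made here for demonstration.

```python
# Illustrative sketch (not from the paper): a generic steepest-ascent
# loop of the kind the abstract describes for the maximization step
# of expectation maximization. All names are hypothetical.

def steepest_ascent(grad, theta0, step=0.05, iters=200):
    """Steepest ascent: repeatedly apply theta <- theta + step * grad(theta)."""
    theta = theta0
    for _ in range(iters):
        theta = theta + step * grad(theta)
    return theta

# Toy objective: log-likelihood of data under a unit-variance Gaussian
# with unknown mean theta; its gradient is sum_i (x_i - theta).
data = [1.8, 2.1, 2.4, 1.7]

def grad_loglik(theta):
    return sum(x - theta for x in data)

# Converges to the sample mean (2.0), the maximizer of the likelihood.
mean_hat = steepest_ascent(grad_loglik, theta0=0.0)
```

In the paper's factor-graph setting, the gradient itself would be assembled from local messages rather than computed globally as above.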
Keywords :
Kalman filters; expectation-maximisation algorithm; gradient methods; graph theory; message passing; parameter estimation; signal processing; smoothing methods; state-space methods; Kalman filtering; Kalman smoothing; backpropagation algorithm; expectation maximization; factor graph; feedforward neural networks; max-product algorithms; message passing; parameter estimation; particle filters; state space model; steepest descent; sum product algorithm; Backpropagation algorithms; Feedforward neural networks; Filtering algorithms; Kalman filters; Message passing; Neural networks; Parameter estimation; Particle filters; Smoothing methods; State-space methods;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Information Theory Workshop, 2005 IEEE
Print_ISBN :
0-7803-9480-1
Type :
conf
DOI :
10.1109/ITW.2005.1531853
Filename :
1531853