Title :
Modeling time dependencies in the mixture of experts
Author :
Fancourt, Craig L. ; Principe, Jose C.
Author_Institution :
Dept. of Electr. Eng., Florida Univ., Gainesville, FL, USA
Abstract :
The mixture of experts, as originally formulated, is a static algorithm in the sense that the output of the network, and the parameter updates during training, are completely independent from one time step to the next. This independence creates difficulties when the model is applied to time series prediction. We address this by adding memory to the mixture of experts. The Gaussian assumption on each expert's error is replaced by a chi-square distribution on the local (in time) root mean square error. We derive new gradient descent equations and present a simulation that demonstrates an improvement in the segmentation of a time series over the classical algorithm.
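To make the idea of "adding memory" concrete, the following is a minimal illustrative sketch (not the authors' exact formulation): in a classical mixture of experts, the gate's posterior responsibility for each expert depends only on that expert's instantaneous error, whereas here each expert's squared error is exponentially smoothed over time before entering the posterior, standing in for the paper's chi-square model on the local RMS error. All names, the smoothing constant `lam`, and the noise variance `sigma2` are hypothetical choices for the sketch.

```python
import numpy as np

def moe_posteriors_with_memory(y, preds, gate_priors, lam=0.9, sigma2=1.0):
    """Gating posteriors for a mixture of experts with memory.

    Instead of the instantaneous squared error e_i(t)^2, each expert i
    carries an exponentially smoothed error

        s_i(t) = lam * s_i(t-1) + (1 - lam) * e_i(t)^2,

    which plays the role of a local (in time) mean square error. The
    posterior is then proportional to the prior times a Gaussian-style
    likelihood evaluated at s_i(t). This is an illustrative stand-in
    for the chi-square formulation in the paper, not a reproduction.

    y           : (T,) target series
    preds       : (T, M) predictions of the M experts
    gate_priors : (M,) fixed mixing priors (a trained gate in practice)
    """
    T, M = preds.shape
    smooth_err = np.zeros(M)          # s_i(0) = 0: no error history yet
    posteriors = np.zeros((T, M))
    for t in range(T):
        e2 = (y[t] - preds[t]) ** 2               # instantaneous sq. error
        smooth_err = lam * smooth_err + (1 - lam) * e2   # memory over errors
        lik = gate_priors * np.exp(-smooth_err / (2 * sigma2))
        posteriors[t] = lik / lik.sum()           # normalize responsibilities
    return posteriors
```

Because the smoothed error decays slowly, the responsibility assigned to each expert changes gradually rather than flickering with single-step errors, which is what improves segmentation of the time series into regimes.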
Keywords :
approximation theory; forecasting theory; neural nets; prediction theory; probability; time series; chi-square distribution; gradient descent equations; mixture of experts; parameter updates; prediction; root mean square error; time dependencies; Computer networks; Delay lines; Electronic mail; Equations; Jacobian matrices; Laboratories; Neural engineering; Nonlinear filters; Probability density function; Root mean square;
Conference_Title :
Neural Networks Proceedings, 1998. IEEE World Congress on Computational Intelligence. The 1998 IEEE International Joint Conference on
Conference_Location :
Anchorage, AK
Print_ISBN :
0-7803-4859-1
DOI :
10.1109/IJCNN.1998.687224