Title :
Hierarchical mixture of experts and max-min propagation neural networks
Author :
Estévez, Pablo A. ; Nakano, Ryohei
Author_Institution :
Dept. of Electr. Eng., Univ. of Chile, Santiago, Chile
Abstract :
The max-min propagation neural network model is considered as a hierarchical mixture of experts by replacing the max (min) units with softmax functions. The resulting mixture differs from the model of Jordan and Jacobs, but we exploit the similarities between the two models to derive a probability model. Learning is treated as a maximum-likelihood problem; in particular, we present a gradient ascent algorithm and an expectation-maximization (EM) algorithm. Simulation results on the parity problem and the majority problem are reported.
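Illustration (not from the paper): a minimal sketch of the core replacement described above, i.e. substituting a hard max (min) unit with a softmax-weighted combination of its inputs so the unit becomes differentiable. The helper names and the sharpness parameter beta are assumptions for illustration only; the paper does not specify this interface.

import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def soft_max_unit(x, beta=5.0):
    # Smooth surrogate for a max unit: a softmax-weighted average of the
    # inputs. As beta grows, the weights concentrate on the largest input,
    # so the output approaches max(x) while staying differentiable.
    # beta is a hypothetical sharpness parameter, not from the paper.
    x = np.asarray(x, dtype=float)
    return float(softmax(beta * x) @ x)

def soft_min_unit(x, beta=5.0):
    # Smooth surrogate for a min unit: negating before the softmax makes
    # the weights concentrate on the smallest input.
    x = np.asarray(x, dtype=float)
    return float(softmax(-beta * x) @ x)

# Example: the soft units approximate hard max/min.
x = [0.2, 0.9, 0.4]
print(soft_max_unit(x))  # ~0.84, close to max(x) = 0.9
print(soft_min_unit(x))  # ~0.27, close to min(x) = 0.2

The softmax weights also admit the gating reading used in the hierarchical-mixture-of-experts view: each weight can be interpreted as the probability of selecting the corresponding branch, which is what opens the door to the maximum-likelihood treatment (gradient ascent or EM) mentioned in the abstract.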
Keywords :
learning (artificial intelligence); maximum likelihood estimation; minimax techniques; neural nets; probability; expectation-maximization algorithm; gradient ascent algorithm; hierarchical mixture of experts; learning algorithm; majority problem; max-min propagation neural network; maximum-likelihood; parity problem; probability model; softmax function; Electronic mail; Expectation-maximization algorithms; Jacobian matrices; Laboratories; Least squares approximation; Least squares methods; Multilayer perceptrons; Neural networks; Vectors
Conference_Title :
Proceedings of the 1995 IEEE International Conference on Neural Networks (ICNN '95)
Conference_Location :
Perth, WA, Australia
Print_ISBN :
0-7803-2768-3
DOI :
10.1109/ICNN.1995.488257