DocumentCode :
1544241
Title :
Mixture of experts regression modeling by deterministic annealing
Author :
Rao, Ajit V. ; Miller, David ; Rose, Kenneth ; Gersho, Allen
Author_Institution :
Dept. of Electr. & Comput. Eng., California Univ., Santa Barbara, CA, USA
Volume :
45
Issue :
11
fYear :
1997
fDate :
11/1/1997
Firstpage :
2811
Lastpage :
2820
Abstract :
We propose a new learning algorithm for regression modeling. The method is especially suitable for optimizing neural network structures that are amenable to a statistical description as mixture models. These include mixtures of experts, hierarchical mixtures of experts (HME), and normalized radial basis functions (NRBF). Unlike recent maximum likelihood (ML) approaches, we directly minimize the (squared) regression error. We use the probabilistic framework as a means to define an optimization method that avoids many shallow local minima on the complex cost surface. Our method is based on deterministic annealing (DA), where the entropy of the system is gradually reduced, with the expected regression cost (energy) minimized at each entropy level. The corresponding Lagrangian is the system's "free energy", and the annealing process is controlled by variation of the Lagrange multiplier, which acts as a "temperature" parameter. The new method consistently and substantially outperformed the competing methods for training NRBF and HME regression functions over a variety of benchmark regression examples.
Keywords :
entropy; error analysis; expert systems; feedforward neural nets; learning (artificial intelligence); optimisation; probability; signal processing; statistical analysis; Lagrange multiplier; deterministic annealing; free energy; information theory; learning algorithm; mixture models; mixture of experts; neural network structure optimization; normalized radial basis functions; optimization method; probabilistic framework; regression cost; regression modeling; signal analysis; squared regression error; statistical description; system entropy; temperature parameter; Annealing; Cost function; Digital signal processing; Entropy; Lagrangian functions; Neural networks; Optimization methods; Probability; Process control; Statistics;
fLanguage :
English
Journal_Title :
IEEE Transactions on Signal Processing
Publisher :
ieee
ISSN :
1053-587X
Type :
jour
DOI :
10.1109/78.650107
Filename :
650107