DocumentCode :
2248
Title :
Approximating Gaussian Mixture Model or Radial Basis Function Network With Multilayer Perceptron
Author :
Patrikar, A.M.
Author_Institution :
Appl. Phys. Lab., Johns Hopkins Univ., Laurel, MD, USA
Volume :
24
Issue :
7
fYear :
2013
fDate :
July 2013
Firstpage :
1161
Lastpage :
1166
Abstract :
Gaussian mixture models (GMMs) and multilayer perceptrons (MLPs) are both popular pattern classification techniques. This brief shows that a multilayer perceptron with quadratic inputs (MLPQ) can accurately approximate a GMM with diagonal covariance matrices. The mapping equations between the parameters of the GMM and the weights of the MLPQ are presented. A similar approach is applied to radial basis function networks (RBFNs) to show that an RBFN with Gaussian basis functions and the Euclidean norm can be approximated accurately by an MLPQ. The mapping equations between RBFN and MLPQ weights are presented. There are well-established training procedures for GMMs, such as the expectation-maximization (EM) algorithm. The GMM parameters obtained by the EM algorithm can be used to generate a set of initial weights for the MLPQ; similarly, a trained RBFN can be used to generate initial MLPQ weights. MLPQ training can then be continued with gradient-descent-based methods, which can improve performance over the GMM or RBFN from which it was initialized. Thus, the MLPQ can always perform as well as or better than the GMM or RBFN.
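The GMM-to-MLPQ correspondence summarized above can be sketched numerically. Because the log of a diagonal-covariance Gaussian is quadratic in each input dimension, the per-component log joint probability equals a linear layer applied to the augmented input [x, x²]. The sketch below is a minimal illustration of that idea, not the paper's notation; all variable names and the small dimensions are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
D, K = 3, 2                            # input dimension, mixture components (illustrative)
mu = rng.normal(size=(K, D))           # component means
var = rng.uniform(0.5, 2.0, (K, D))    # diagonal covariances
pi = np.array([0.4, 0.6])              # mixing weights

def gmm_log_joint(x):
    """Direct evaluation: log pi_k + log N(x; mu_k, diag(var_k))."""
    quad = -0.5 * np.sum((x - mu) ** 2 / var, axis=1)
    norm = -0.5 * np.sum(np.log(2 * np.pi * var), axis=1)
    return np.log(pi) + norm + quad

# Equivalent linear layer on quadratic inputs [x, x**2]:
# expanding -(x - mu)^2 / (2 var) gives weights mu/var on x and -1/(2 var) on x^2,
# with the remaining constant terms folded into the bias.
W = np.concatenate([mu / var, -0.5 / var], axis=1)        # (K, 2D) weights
b = (np.log(pi)
     - 0.5 * np.sum(mu ** 2 / var, axis=1)
     - 0.5 * np.sum(np.log(2 * np.pi * var), axis=1))     # (K,) biases

x = rng.normal(size=D)
z = W @ np.concatenate([x, x ** 2]) + b                   # MLPQ pre-activations
assert np.allclose(z, gmm_log_joint(x))
```

A softmax over these pre-activations then reproduces the GMM's component posteriors exactly, which is what makes the mapping usable as an initialization before further gradient-based training.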
Keywords :
Gaussian processes; expectation-maximisation algorithm; learning (artificial intelligence); multilayer perceptrons; pattern classification; radial basis function networks; EM algorithm; Euclidean norm; GMM; Gaussian basis functions; Gaussian mixture model approximation; MLPQ training; MLPQ weights; RBFN weights; diagonal covariance matrices; expectation maximization algorithm; gradient-descent based methods; mapping equations; multilayer perceptron with quadratic inputs; pattern classification techniques; radial basis function networks; training procedures; Artificial neural networks; hidden Markov models (HMMs); multilayer perceptron (MLP); radial basis function networks (RBFNs);
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks and Learning Systems
Publisher :
IEEE
ISSN :
2162-237X
Type :
jour
DOI :
10.1109/TNNLS.2013.2249086
Filename :
6490412