DocumentCode :
2890160
Title :
Combining hidden Markov model and neural network classifiers
Author :
Niles, Les T. ; Silverman, Harvey F.
Author_Institution :
Div. of Eng., Brown Univ., Providence, RI, USA
fYear :
1990
fDate :
3-6 Apr 1990
Firstpage :
417
Abstract :
An architecture for a neural network that implements a hidden Markov model (HMM) is presented. This HMM net suggests integrating signal preprocessing (such as vector quantization) with the classifier. A minimum mean-squared-error training criterion for the HMM/neural net is presented and compared to the maximum-likelihood and maximum-mutual-information criteria. The HMM forward-backward algorithm is shown to be the same as the neural net backpropagation algorithm. The implications of the probability constraints on the HMM parameters are discussed. Relaxing these constraints allows negative probabilities, which are equivalent to inhibitory connections. A probabilistic interpretation is given for a network with negative, and even complex-valued, parameters.
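Abstract_Note :
The abstract's central observation, that the HMM forward-backward recursion can be viewed as a layered recurrent-network computation, can be illustrated with a minimal sketch of the forward pass written as matrix-vector steps. The function name hmm_forward and the toy two-state model below are assumptions for illustration only, not taken from the paper.
```python
# Illustrative sketch (not the authors' exact formulation): the HMM forward
# recursion expressed as a recurrent, layer-by-layer computation, where the
# alpha vector plays the role of a hidden-layer activation and the transition
# matrix A acts like a weight matrix applied at every time step.
import numpy as np

def hmm_forward(pi, A, B, obs):
    """Forward pass: returns per-frame alpha vectors and the sequence likelihood.

    pi  : (N,)   initial state probabilities
    A   : (N, N) transition probabilities, A[i, j] = P(state j | state i)
    B   : (N, M) emission probabilities,  B[j, k] = P(symbol k | state j)
    obs : (T,)   integer observation sequence
    """
    alpha = pi * B[:, obs[0]]                  # "layer 0" activation
    alphas = [alpha]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]          # linear step + multiplicative gating
        alphas.append(alpha)
    return np.stack(alphas), alphas[-1].sum()  # P(obs | model)

# Tiny usage example with a hypothetical 2-state, 3-symbol model.
pi = np.array([0.6, 0.4])
A  = np.array([[0.7, 0.3],
               [0.4, 0.6]])
B  = np.array([[0.5, 0.4, 0.1],
               [0.1, 0.3, 0.6]])
alphas, likelihood = hmm_forward(pi, A, B, obs=np.array([0, 1, 2]))
```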
Keywords :
Markov processes; neural nets; forward-backward algorithm; hidden Markov model; inhibitory connections; integrating signal preprocessing; maximum-mutual-information criteria; minimum mean-squared-error training criterion; negative probabilities; neural net backpropagation algorithm; neural network classifiers; probability constraints; vector quantization; Backpropagation algorithms; Hidden Markov models; Information analysis; Neural networks; Pattern analysis; Probability; Recurrent neural networks; Speech analysis; Speech recognition; Training data; Vector quantization; Virtual manufacturing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1990 International Conference on Acoustics, Speech, and Signal Processing (ICASSP-90)
Conference_Location :
Albuquerque, NM
ISSN :
1520-6149
Type :
conf
DOI :
10.1109/ICASSP.1990.115724
Filename :
115724