Title :
A new unsupervised learning algorithm for multilayer perceptrons based on information theory principles
Author_Institution :
NTT Human Interface Lab., Tokyo, Japan
Abstract :
The author describes a novel learning algorithm for multilayer perceptrons (MLPs). The trained MLPs are used as the vector quantizer (VQ) in a hidden Markov model (HMM) based speech recognition system. The approach is an unsupervised learning algorithm for multilayer perceptrons: the neurons of the output layer do not receive any specific target values during training; instead, the output is learned through principles of self-organization, with information theory principles serving as the learning criteria for the MLP. When such a VQ is used in an HMM-based speech recognition system, multiple features such as cepstral parameters, differential cepstral parameters, and energy can serve as joint input to the same VQ, thus avoiding the use of multiple codebooks. In this way, the principle of 'sensor fusion' is transferred to the speech recognition area with the same intention, namely using neural networks to merge the outputs of different information sources in order to obtain an improved feature extractor for more robust pattern recognition.
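The abstract does not spell out which information-theoretic criterion is used, so the sketch below only illustrates one common choice for unsupervised, VQ-style training of an MLP: maximizing a mutual-information-style objective (entropy of the average output minus the average per-sample output entropy), so that codewords are used evenly while each frame's assignment stays crisp. The network sizes, the toy data standing in for concatenated cepstral, differential-cepstral, and energy features, and the criterion itself are all assumptions for illustration, not the paper's implementation.

```python
# Minimal sketch (assumed criterion, not the authors' exact algorithm):
# an MLP with a softmax output layer trained without targets, whose
# argmax output unit afterwards acts as the VQ codeword index.
import numpy as np

rng = np.random.default_rng(0)

# Toy "joint feature" vectors standing in for cepstra + delta-cepstra + energy.
N, D, H, K = 512, 14, 32, 16     # samples, input dim, hidden units, code symbols
X = rng.normal(size=(N, D))

# One tanh hidden layer, softmax output over K code symbols.
W1 = rng.normal(scale=0.1, size=(D, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(H, K)); b2 = np.zeros(K)

def forward(X):
    Hid = np.tanh(X @ W1 + b1)                        # hidden activations
    Z = Hid @ W2 + b2                                 # logits
    Z = Z - Z.max(axis=1, keepdims=True)              # numerical stability
    P = np.exp(Z); P /= P.sum(axis=1, keepdims=True)  # softmax "code posteriors"
    return Hid, P

lr, eps = 0.1, 1e-12
for step in range(200):
    Hid, P = forward(X)
    Q = P.mean(axis=0)                                # marginal codeword usage
    # Criterion to maximize: I = H(Q) - mean_i H(P_i)
    I = -(Q * np.log(Q + eps)).sum() + (P * np.log(P + eps)).sum() / N

    # Gradient of the loss L = -I w.r.t. P, then back through softmax and MLP.
    dL_dP = (np.log(Q + eps)[None, :] - np.log(P + eps)) / N
    dL_dZ = P * (dL_dP - (dL_dP * P).sum(axis=1, keepdims=True))
    dW2 = Hid.T @ dL_dZ; db2 = dL_dZ.sum(axis=0)
    dHid = (dL_dZ @ W2.T) * (1.0 - Hid ** 2)          # tanh derivative
    dW1 = X.T @ dHid;    db1 = dHid.sum(axis=0)

    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# After training, each frame is quantized to the most active output neuron.
_, P = forward(X)
codes = P.argmax(axis=1)
print("criterion:", round(float(I), 3), "codewords used:", len(set(codes)))
```

In an HMM recognizer along the lines described in the abstract, these codeword indices would replace the indices produced by a conventional (e.g., k-means) codebook, with the single MLP fusing all input features instead of requiring one codebook per feature stream.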
Keywords :
Markov processes; encoding; information theory; learning systems; neural nets; speech recognition; differential cepstral parameters; energy; feature extractor; hidden Markov model based speech recognition system; information theory; multilayer perceptrons; neural networks; pattern recognition; self-organization; sensor fusion; unsupervised learning algorithm; vector quantizer; Cepstral analysis; Hidden Markov models; Information theory; Merging; Multilayer perceptrons; Neural networks; Neurons; Sensor fusion; Speech recognition; Unsupervised learning;
Conference_Titel :
1991 IEEE International Joint Conference on Neural Networks (IJCNN 1991)
Print_ISBN :
0-7803-0227-3
DOI :
10.1109/IJCNN.1991.170683