DocumentCode :
2020914
Title :
Recurrent input transformations for hidden Markov models
Author :
Valtchev, V. ; Kapadia, S. ; Young, S.J.
Author_Institution :
Eng. Dept., Cambridge Univ., UK
Volume :
2
fYear :
1993
fDate :
27-30 April 1993
Firstpage :
287
Abstract :
A novel architecture which integrates recurrent input transformations (RITs) and continuous density hidden Markov models (HMMs) is presented. The basic HMM structure is extended to accommodate recurrent neural networks which transform the input observations before they enter the Gaussian output distributions associated with the states of the HMM. During training, the parameters of both the HMM and the RIT are simultaneously optimized according to the maximum mutual information (MMI) criterion. Results for the E-set recognition task are presented, demonstrating the ability of RITs to exploit longer-term correlations in the speech signal and to give improved discrimination.
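The abstract's core idea, a recurrent transformation applied to observations before they are scored by a state's Gaussian output density, can be illustrated with a minimal sketch. All function names, weights, and parameter values below are hypothetical and chosen for illustration; they are not taken from the paper.

```python
import math

def rit_transform(observations, w_in=0.8, w_rec=0.3):
    """Hypothetical one-dimensional recurrent input transformation:
    each output depends on the current observation and the previous
    hidden value, so it can carry longer-term context."""
    h = 0.0
    transformed = []
    for x in observations:
        h = math.tanh(w_in * x + w_rec * h)  # simple recurrent unit
        transformed.append(h)
    return transformed

def gaussian_log_density(x, mean, var):
    """Log density of a univariate Gaussian state output distribution."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

obs = [0.2, 1.1, -0.4, 0.9]
feats = rit_transform(obs)
# Score the transformed features against one HMM state's Gaussian.
score = sum(gaussian_log_density(f, mean=0.0, var=1.0) for f in feats)
```

In the paper's full scheme, the weights of the recurrent transform and the Gaussian parameters would be jointly optimized under the MMI criterion rather than fixed as here.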
Keywords :
correlation methods; hidden Markov models; learning (artificial intelligence); recurrent neural nets; speech recognition; E-set recognition; Gaussian output distributions; HMM structure; architecture; continuous density hidden Markov models; longer-term correlations; maximum mutual information; recurrent input transformation; recurrent neural networks; speech discrimination; training;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1993 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP-93)
Conference_Location :
Minneapolis, MN, USA
ISSN :
1520-6149
Print_ISBN :
0-7803-7402-9
Type :
conf
DOI :
10.1109/ICASSP.1993.319292
Filename :
319292