DocumentCode :
285149
Title :
Word recognition with recurrent network automata
Author :
Albesano, Dario ; Gemello, Roberto ; Mana, Franco
Author_Institution :
CSELT, Torino, Italy
Volume :
2
fYear :
1992
fDate :
7-11 Jun 1992
Firstpage :
308
Abstract :
The authors report a method to encode temporal information directly into a neural network by explicitly modeling that information with a left-to-right automaton and teaching a recurrent network to identify the automaton states. The state lengths and positions are adjusted with the usual iterative train-and-resegment procedure. The global model is a hybrid of a recurrent neural network, which implements the state transition models, and dynamic programming, which finds the best state sequence. The advantages gained by using recurrent networks are illustrated by applying the method to a speaker-independent digit recognition task.
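The abstract describes a hybrid in which a recurrent network scores automaton states frame by frame and dynamic programming selects the best state sequence. The sketch below is not the authors' code; it is a minimal illustration that assumes hypothetical per-frame log-scores (random numbers standing in for recurrent-network outputs) and performs a Viterbi-style alignment over a left-to-right automaton that allows only "stay" or "advance by one" transitions.

# Hypothetical sketch, not the authors' implementation: dynamic-programming
# alignment of per-frame state scores over a left-to-right automaton.
import numpy as np

def viterbi_left_to_right(frame_scores):
    """frame_scores: (T, S) array of log-scores for each automaton state at each frame.
    Returns the best left-to-right state sequence (stay or advance by one) and its score."""
    T, S = frame_scores.shape
    cost = np.full((T, S), -np.inf)
    back = np.zeros((T, S), dtype=int)
    cost[0, 0] = frame_scores[0, 0]          # alignment must start in the first state
    for t in range(1, T):
        for s in range(S):
            stay = cost[t - 1, s]
            advance = cost[t - 1, s - 1] if s > 0 else -np.inf
            if stay >= advance:
                cost[t, s] = stay + frame_scores[t, s]
                back[t, s] = s
            else:
                cost[t, s] = advance + frame_scores[t, s]
                back[t, s] = s - 1
    # backtrack from the final state at the last frame
    path = [S - 1]
    for t in range(T - 1, 0, -1):
        path.append(back[t, path[-1]])
    return path[::-1], cost[-1, -1]

if __name__ == "__main__":
    # Example: 6 frames, 3 automaton states; random scores stand in for RNN outputs.
    rng = np.random.default_rng(0)
    scores = rng.standard_normal((6, 3))
    path, total = viterbi_left_to_right(scores)
    print(path, total)

In the paper's setup the per-frame scores would come from the recurrent network's state outputs, and the resulting alignment would drive the re-segmentation step during training; the transition scheme here is an assumption made only for illustration.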
Keywords :
automata theory; dynamic programming; recurrent neural nets; speech recognition; dynamic programming; global model; left-to-right automaton; neural network; recurrent network automata; speaker-independent digit recognition task; state transition models; temporal information encoding; word recognition; Automatic speech recognition; Context modeling; Dynamic programming; Hidden Markov models; Learning automata; Neural networks; Predictive models; RNA; Speech recognition; Telecommunications;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1992
Conference_Location :
Baltimore, MD
Print_ISBN :
0-7803-0559-0
Type :
conf
DOI :
10.1109/IJCNN.1992.226970
Filename :
226970