Title :
Input-output HMMs for sequence processing
Author :
Bengio, Yoshua; Frasconi, Paolo
Author_Institution :
Dept. of Comput. Sci. & Oper. Res., Montreal Univ., Que., Canada
Date :
1 September 1996
Abstract :
We consider problems of sequence processing and propose a solution based on a discrete-state model that represents past context. We introduce a recurrent connectionist architecture with a modular structure that associates a subnetwork with each state. The model has a statistical interpretation that we call the input-output hidden Markov model (IOHMM). It can be trained by the expectation-maximization (EM) or generalized EM (GEM) algorithms, treating state trajectories as missing data, which decouples temporal credit assignment from parameter estimation. The model is similar to hidden Markov models (HMMs) but allows us to map input sequences to output sequences, using the same processing style as recurrent neural networks. IOHMMs are trained with a more discriminant learning paradigm than HMMs, while potentially taking advantage of the EM algorithm. We demonstrate that IOHMMs are well suited for solving grammatical inference problems. Experimental results are presented for the seven Tomita grammars, a standard benchmark, showing that these adaptive models can attain excellent generalization.
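To make the architecture concrete, the following minimal NumPy sketch illustrates an IOHMM forward recursion in which each state owns a linear transition subnetwork and a linear output subnetwork. This is our own hedged sketch, not the authors' implementation: the names IOHMM, Wt, Wo, and forward are illustrative assumptions, the paper's subnetworks are neural networks rather than single linear maps, and EM/GEM training is omitted here; only the forward pass that an E-step would rely on is shown.

import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class IOHMM:
    # Hypothetical, illustrative parameterization: each state j owns two
    # linear "subnetworks" acting on the current input u_t:
    #   - a transition net giving P(x_t = i | x_{t-1} = j, u_t)
    #   - an output net giving  P(y_t | x_t = j, u_t)
    def __init__(self, n_states, n_inputs, n_outputs):
        self.n = n_states
        self.Wt = rng.normal(0.0, 0.1, (n_states, n_inputs, n_states))
        self.Wo = rng.normal(0.0, 0.1, (n_states, n_inputs, n_outputs))
        self.pi = np.full(n_states, 1.0 / n_states)  # initial state distribution

    def forward(self, U, Y):
        # alpha[t, i] = P(y_1..y_t, x_t = i | u_1..u_t): the quantity the
        # E-step would use when treating state trajectories as missing data.
        T = len(U)
        alpha = np.zeros((T, self.n))
        for t in range(T):
            A = softmax(np.einsum('k,jki->ji', U[t], self.Wt))         # A[j, i]
            b = softmax(np.einsum('k,jko->jo', U[t], self.Wo))[:, Y[t]]  # b[j]
            alpha[t] = (self.pi if t == 0 else alpha[t - 1] @ A) * b
        return alpha

# Toy usage on a random binary sequence (illustrative only):
m = IOHMM(n_states=3, n_inputs=2, n_outputs=2)
U = np.eye(2)[rng.integers(0, 2, size=8)]  # one-hot inputs u_1..u_8
Y = rng.integers(0, 2, size=8)             # observed output symbols
print("sequence likelihood:", m.forward(U, Y)[-1].sum())

Note how the transition and emission probabilities are recomputed at every step from the current input, which is what distinguishes an IOHMM from a standard HMM with fixed transition and emission matrices.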
Keywords :
discrete systems; hidden Markov models; learning (artificial intelligence); parameter estimation; probability; recurrent neural nets; state-space methods; Tomita grammars; discrete-state model; expectation-maximization; input-output hidden Markov model; learning; modular structure; parameter estimation; recurrent connectionist architecture; sequence processing; state trajectories; Backpropagation algorithms; Context modeling; Delay; Hidden Markov models; Inference algorithms; Natural languages; Parameter estimation; Production; Recurrent neural networks; State estimation
Journal_Title :
IEEE Transactions on Neural Networks