Title :
Hidden Markov estimation for unrestricted stochastic context-free grammars
Author_Institution :
Xerox Palo Alto Res. Center, CA, USA
Abstract :
A novel algorithm for estimating the parameters of a hidden stochastic context-free grammar is presented. In contrast to the inside/outside (I/O) algorithm, it does not require the grammar to be expressed in Chomsky normal form and can therefore operate directly on more natural representations of a grammar. The algorithm uses a trellis-based structure, as opposed to the binary branching tree structure used by the I/O algorithm. The form of the trellis is an extension of that used by the forward/backward (F/B) algorithm, and as a result the algorithm reduces to the latter for components that can be modeled as finite-state networks. In the same way that a hidden Markov model (HMM) is a stochastic analog of a finite-state network, the representation used by the algorithm is a stochastic analog of a recursive transition network, in which a state may be simple or may itself contain an underlying structure.
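To make the abstract's notion of a stochastic recursive transition network concrete, the following minimal sketch (not the paper's algorithm) represents each state as either a simple terminal-emitting state or a reference to a named sub-network; the toy grammar, network names, and probabilities are assumptions chosen purely for illustration. A network with no sub-network states degenerates to an ordinary finite-state (HMM-like) model, mirroring the reduction to the F/B case described above.

```python
# Minimal sketch of a stochastic recursive transition network (illustrative only,
# not the estimation algorithm from the paper). Each rule is a probability plus
# a sequence of states; a state either emits a terminal or calls a sub-network.
import random

# Hypothetical toy grammar, assumed for illustration: S -> "a" S "b" | "a" "b"
NETWORKS = {
    "S": [
        (0.5, [("emit", "a"), ("call", "S"), ("emit", "b")]),
        (0.5, [("emit", "a"), ("emit", "b")]),
    ],
}

def sample(net_name, rng=random):
    """Generate one string by recursively expanding sub-network ("call") states."""
    r, acc = rng.random(), 0.0
    for prob, states in NETWORKS[net_name]:
        acc += prob
        if r <= acc:
            out = []
            for kind, value in states:
                if kind == "emit":
                    out.append(value)               # simple state: emit a terminal
                else:
                    out.extend(sample(value, rng))  # recursive state: expand sub-network
            return out
    return []

if __name__ == "__main__":
    random.seed(0)
    print(["".join(sample("S")) for _ in range(5)])  # strings of the form a^n b^n
```

Because the "call" states may invoke their own network recursively, the model generates context-free (not merely regular) languages, which is the property the trellis-based estimator is designed to handle directly.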
Keywords :
context-free grammars; hidden Markov models; parameter estimation; speech recognition; stochastic processes; HMM; finite-state networks; hidden Markov estimation; hidden Markov model; recursive transition network; stochastic analog; stochastic context-free grammars; trellis structure; Context modeling; Production; Robustness; Speech; Tree data structures
Conference_Title :
1992 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP-92)
Conference_Location :
San Francisco, CA
Print_ISBN :
0-7803-0532-9
DOI :
10.1109/ICASSP.1992.225943