DocumentCode :
2932074
Title :
Markov random field models for natural language
Author :
Mark, Kevin E. ; Miller, Michael I. ; Grenander, Ulf
Author_Institution :
Dept. of Electr. Eng., Washington Univ., St. Louis, MO, USA
fYear :
1995
fDate :
17-22 Sep 1995
Firstpage :
392
Abstract :
Markov chain (N-gram) source models for natural language were explored by Shannon and have found wide application in speech recognition systems. However, the underlying linear graph structure is inadequate to express the hierarchical structure of language necessary for encoding syntactic information. Context-free language models, which generate tree graphs, are a natural way of encoding this information, but they do not model interword dependencies. We consider a hybrid tree/chain graph structure, which has the advantage of incorporating lexical dependencies into syntactic representations. Two Markov random field probability measures are derived on these tree/chain graphs from the maximum entropy principle.
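As background to the abstract's maximum entropy claim, the sketch below gives the standard Gibbs (exponential-family) form that the maximum entropy principle yields under expected-feature constraints; it is a generic illustration, not the paper's two specific measures, and the example clique features on the hybrid tree/chain graph are assumptions for exposition.

    % Generic maximum-entropy (Gibbs) distribution over configurations \omega
    % of a graph; the clique features f_i are illustrative choices, e.g.
    % adjacent-word pairs on the chain and parent--child pairs on the tree.
    \[
      P_{\lambda}(\omega) \;=\; \frac{1}{Z(\lambda)}
        \exp\!\Big( \sum_{i} \lambda_i f_i(\omega) \Big),
      \qquad
      Z(\lambda) \;=\; \sum_{\omega'} \exp\!\Big( \sum_{i} \lambda_i f_i(\omega') \Big),
    \]
    % where the multipliers \lambda_i are set so that the model's expected
    % feature values E_{P_\lambda}[f_i] match the empirical constraints.

Because the features factor over cliques of the tree/chain graph, such a distribution is a Markov random field on that graph, which is how the maximum entropy principle connects to the models described above.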
Keywords :
Markov processes; context-free grammars; graph theory; maximum entropy methods; natural languages; probability; random processes; speech recognition; Markov chain source models; Markov random field models; Markov random field probability measures; context free language models; hierarchical structure; hybrid tree/chain graph structure; interword dependencies modeling; lexical dependencies; linear graph structure; maximum entropy principle; natural language; speech recognition systems; syntactic information encoding; syntactic representations; Constraint theory; Context modeling; Entropy; Frequency; Markov random fields; Natural languages; Probability; Statistics; Stochastic processes; Tree graphs;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
Proceedings of the 1995 IEEE International Symposium on Information Theory
Conference_Location :
Whistler, BC
Print_ISBN :
0-7803-2453-6
Type :
conf
DOI :
10.1109/ISIT.1995.550379
Filename :
550379