Title :
Learning word embeddings from dependency relations
Author :
Yinggong Zhao ; Shujian Huang ; Xinyu Dai ; Jianbing Zhang ; Jiajun Chen
Author_Institution :
State Key Lab. for Novel Software Technol., Nanjing Univ., Nanjing, China
Abstract :
Continuous-space word representations have demonstrated their effectiveness in many natural language processing (NLP) tasks. The basic idea of embedding training is to update the embedding matrix based on each word's context. However, such context has typically been constrained to a fixed window of surrounding words, which we believe is insufficient to capture the actual relations of a given center word. In this work we extend previous approaches by learning distributed representations from the dependency structure of a sentence, which can capture long-distance relations. Such contexts yield better word semantics, as demonstrated on the Semantic-Syntactic Word Relationship task. In addition, the dependency embeddings achieve competitive results on the WordSim-353 task.
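To illustrate the idea of dependency-based contexts described above, the sketch below extracts (center word, context) training pairs from the arcs of a dependency parse instead of from a fixed surrounding-word window. This is a minimal sketch under stated assumptions, not the paper's implementation: it assumes spaCy with the "en_core_web_sm" model for parsing, and the pair format and function name are illustrative only.

    # Minimal sketch: build dependency-based (word, context) pairs for
    # embedding training, instead of fixed-window contexts.
    # Assumes spaCy and the "en_core_web_sm" model are installed;
    # names and pair encoding are illustrative, not from the paper.
    import spacy

    nlp = spacy.load("en_core_web_sm")

    def dependency_contexts(sentence):
        """Yield (center_word, context) pairs from the dependency parse.

        Each arc (head --rel--> modifier) produces two pairs:
          (head,     "rel_modifier") and (modifier, "rel_inv_head"),
        so syntactic neighbours act as contexts even when they are far
        apart in the linear word order.
        """
        doc = nlp(sentence)
        for token in doc:
            if token.dep_ == "ROOT":
                continue
            head = token.head
            yield head.text.lower(), f"{token.dep_}_{token.text.lower()}"
            yield token.text.lower(), f"{token.dep_}_inv_{head.text.lower()}"

    if __name__ == "__main__":
        for pair in dependency_contexts("Australian scientist discovers star with telescope"):
            print(pair)

The resulting pairs can be fed to any skip-gram-style objective in place of window-based pairs; only the definition of "context" changes.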
Keywords :
computational linguistics; knowledge representation; learning (artificial intelligence); matrix algebra; natural language processing; NLP; WordSim-353 task; continuous-space word representation; dependency relations; embedding matrix; semantic-syntactic word relationship; sentence dependency structure; word embedding learning; Accuracy; Context; Context modeling; Correlation; Semantics; Syntactics; Training; dependency; distributed representation; word embedding
Conference_Titel :
Asian Language Processing (IALP), 2014 International Conference on
Conference_Location :
Kuching
DOI :
10.1109/IALP.2014.6973490