Title :
Incorporating local word relationships into probabilistic topic models
Author :
Marziea Rahimi;Morteza Zahedi;Hoda Mashayekhi
Author_Institution :
School of IT and Computer Engineering, Shahrood University of Technology, Semnan, Iran
Date :
5/1/2015
Abstract :
Probabilistic topic models have been very popular in automatic text analysis since their introduction. As dimensionality reduction methods, they are similar to term clustering methods. These models are based on word co-occurrence but are not very flexible with respect to the context in which co-occurrence is defined. Probabilistic topic models do not allow local or spatial information to be taken into account, and their performance therefore suffers on short documents and in applications bound to local data. Despite their generally better performance compared to term clustering methods, probabilistic topic models lack one of the key features of term clustering methods: flexibility in defining the context in which co-occurrence is calculated. In this paper, we introduce a perspective on probabilistic topic models that can lead to more flexible models, together with a model that, according to this perspective, provides the mentioned flexibility.
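To illustrate the notion of "context in which co-occurrence is defined" mentioned in the abstract (this is an illustrative sketch, not the authors' proposed model), the following minimal Python example contrasts two definitions of co-occurrence context: the whole document, as standard topic models implicitly assume, versus a local sliding window. The function names and window size are assumptions made for illustration only.

from collections import Counter
from itertools import combinations

def doc_level_cooccurrence(tokens):
    # Context = the whole document: every distinct word pair co-occurs once.
    pairs = combinations(sorted(set(tokens)), 2)
    return Counter(pairs)

def window_cooccurrence(tokens, window=5):
    # Context = a local sliding window: only nearby words count as co-occurring.
    counts = Counter()
    for i, w in enumerate(tokens):
        for v in tokens[i + 1:i + window]:
            if v != w:
                counts[tuple(sorted((w, v)))] += 1
    return counts

tokens = "topic models capture word co-occurrence within a context".split()
print(doc_level_cooccurrence(tokens).most_common(3))
print(window_cooccurrence(tokens, window=3).most_common(3))

Under the document-level definition, distant words are treated the same as adjacent ones, whereas the window-based definition restricts co-occurrence to local neighborhoods; the flexibility to choose between such definitions is what the abstract identifies as missing from standard probabilistic topic models.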
Keywords :
"Manganese","Hidden Markov models","Probabilistic logic","Context modeling","Context","Computational modeling","Data models"
Conference_Titel :
2015 7th Conference on Information and Knowledge Technology (IKT)
Print_ISBN :
978-1-4673-7483-5
DOI :
10.1109/IKT.2015.7288758