• DocumentCode
    3767555
  • Title
    Topic2Vec: Learning distributed representations of topics
  • Author
    Liqiang Niu; Xinyu Dai; Jianbing Zhang; Jiajun Chen
  • Author_Institution
    Natural Language Processing Research Group, Department of Computer Science and Technology, National Key Laboratory for Novel Software Technology, Nanjing University, 210023, China
  • fYear
    2015
  • Firstpage
    193
  • Lastpage
    196
  • Abstract
    Latent Dirichlet Allocation (LDA), which mines the thematic structure of documents, plays an important role in natural language processing and machine learning. However, the probability distributions produced by LDA only describe the statistical relationships of occurrences in the corpus, and in practice such probabilities are often not the best choice for feature representations. Recently, embedding methods such as Word2Vec and Doc2Vec have been proposed to represent words and documents by learning their essential concepts and representations. These embedded representations have proven more effective than LDA-style representations in many tasks. In this paper, we propose Topic2Vec, an approach that learns topic representations in the same semantic vector space as words, as an alternative to probability distributions. Experiments show that Topic2Vec achieves interesting and meaningful results. A minimal illustrative sketch of the shared word and topic embedding idea follows this record.
  • Keywords
    "Drugs","Resource management","Artificial neural networks","Natural languages"
  • Publisher
    ieee
  • Conference_Titel
    2015 International Conference on Asian Language Processing (IALP)
  • Print_ISBN
    978-1-4673-9595-3
  • Type
    conf
  • DOI
    10.1109/IALP.2015.7451564
  • Filename
    7451564
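
A minimal sketch of the shared embedding idea mentioned in the abstract, not the authors' actual training procedure: each word is interleaved with a synthetic topic token (assumed to come from a precomputed LDA assignment), and a standard skip-gram model is trained over the augmented token streams so that word vectors and topic vectors end up in one space. The toy documents, topic assignments, hyperparameters, and the TOPIC_k token naming below are illustrative assumptions; gensim's Word2Vec stands in for the paper's model.

    # Hypothetical illustration only: approximate the idea of embedding topics and
    # words in one vector space by interleaving each word with a synthetic topic
    # token and training an ordinary skip-gram model over the augmented streams.
    from gensim.models import Word2Vec

    # Toy documents and per-word topic assignments (normally produced by LDA inference).
    docs = [
        ["stock", "market", "trading", "price"],
        ["drug", "treatment", "patient", "dose"],
    ]
    word_topic = {
        "stock": 0, "market": 0, "trading": 0, "price": 0,
        "drug": 1, "treatment": 1, "patient": 1, "dose": 1,
    }

    # Interleave each word with its topic token so words and topics share one
    # vocabulary and therefore one embedding space.
    augmented = [
        [tok for w in doc for tok in (w, f"TOPIC_{word_topic[w]}")]
        for doc in docs
    ]

    # Skip-gram (sg=1) over the augmented token streams; gensim >= 4.0 API.
    model = Word2Vec(augmented, vector_size=50, window=4, min_count=1, sg=1, epochs=100)

    # Topic vectors can now be queried like word vectors, e.g. nearest words to a topic.
    print(model.wv.most_similar("TOPIC_1", topn=3))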