• DocumentCode
    729772
  • Title
    Joint Latent Dirichlet Allocation for non-iid social tags
  • Author
    Jiangchao Yao ; Ya Zhang ; Zhe Xu ; Jun Sun ; Jun Zhou ; Xiao Gu
  • Author_Institution
    Shanghai Key Lab. of Digital Media Process. & Transmissions, Shanghai Jiao Tong Univ., Shanghai, China
  • fYear
    2015
  • fDate
    June 29 - July 3, 2015
  • Firstpage
    1
  • Lastpage
    6
  • Abstract
    Topic models have been widely used for analyzing text corpora and have achieved great success in applications including content organization and information retrieval. However, unlike traditional text data, social tags in web containers are usually small in number, unordered, and non-iid, i.e., they are highly dependent on contextual information such as users and objects. Considering these specific characteristics of social tags, we introduce a new model named Joint Latent Dirichlet Allocation (JLDA) to capture the relationships among users, objects, and tags. The model assumes that the latent topics of users and those of objects jointly influence the generation of tags. The latent distributions are then inferred with Gibbs sampling. Experiments on two social tag data sets demonstrate that the model achieves lower predictive error and generates more reasonable topics. We also present an interesting application of this model to object recommendation.
  • Keywords
    information retrieval; learning (artificial intelligence); recommender systems; sampling methods; Gibbs sampling; JLDA model; Web containers; content organization; contextual information; information retrieval; joint latent Dirichlet allocation; non-IID social tags; object recommendation; social tags; text corpora analysis; topic models; Adsorption; Analytical models; Art; Meteorology; Pollution; Surface emitting lasers; Surface treatment; JLDA; Topic model; non-iid learning; social tags;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Multimedia and Expo (ICME), 2015 IEEE International Conference on
  • Conference_Location
    Turin
  • Type
    conf
  • DOI
    10.1109/ICME.2015.7177490
  • Filename
    7177490
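
The abstract describes the JLDA generative assumption (latent user topics and latent object topics jointly drive tag generation) and its Gibbs-sampling inference only at a high level. The following Python sketch shows one plausible reading of that generative process; the per-(user-topic, object-topic) tag distribution `phi`, the hyperparameters `alpha` and `beta`, and all toy sizes are illustrative assumptions rather than the paper's exact specification.

```python
# Minimal sketch of a generative process consistent with the abstract's
# description of JLDA (user topics and object topics jointly generate tags).
# The per-topic-pair tag distribution phi[x, y] and all hyperparameters
# (alpha, beta, K_u, K_o) are assumptions; the paper's exact factorization
# may differ.
import numpy as np

rng = np.random.default_rng(0)

n_users, n_objects, vocab_size = 5, 8, 20   # toy sizes (assumed)
K_u, K_o = 3, 4                             # user / object topic counts (assumed)
alpha, beta = 0.1, 0.01                     # Dirichlet hyperparameters (assumed)

# Per-user and per-object topic mixtures.
theta_u = rng.dirichlet(alpha * np.ones(K_u), size=n_users)
theta_o = rng.dirichlet(alpha * np.ones(K_o), size=n_objects)

# One tag distribution per (user-topic, object-topic) pair: the "joint" part.
phi = rng.dirichlet(beta * np.ones(vocab_size), size=(K_u, K_o))

def generate_tag(user, obj):
    """Sample one tag for a (user, object) pair under the assumed model."""
    x = rng.choice(K_u, p=theta_u[user])    # user-side topic
    y = rng.choice(K_o, p=theta_o[obj])     # object-side topic
    return rng.choice(vocab_size, p=phi[x, y])

# Example: generate a few tagging events as (user, object, tag) triples.
events = [(u, o, generate_tag(u, o))
          for u, o in zip(rng.integers(0, n_users, 10),
                          rng.integers(0, n_objects, 10))]
print(events)
```

Under this reading, collapsed Gibbs sampling would resample the latent topic pair (x, y) for each observed (user, object, tag) triple conditioned on the remaining assignments; the paper's actual conditional updates may differ from this sketch.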