• DocumentCode
    3746243
  • Title
    Multiple source domain adaptation: A sharper bound using weighted Rademacher complexity
  • Author
    Jianwei Liu;Jiajia Zhou;Xionglin Luo

  • Author_Institution
    Department of Automation, China University of Petroleum Beijing, China
  • fYear
    2015
  • Firstpage
    546
  • Lastpage
    553
  • Abstract
    Traditional supervised learning algorithms assume that the training data and the test data are drawn from the same probability distribution. Since this assumption does not hold in many modern applications of machine learning, domain adaptation problems arise when the data distribution in the test domain differs from that in the training domain. This paper studies the problem of domain adaptation with multiple sources, which has received considerable attention in many areas. We introduce a novel complexity measure, the weighted Rademacher complexity, to restrict the complexity of a hypothesis class in multiple source domain adaptation. We explore its self-bounding properties and derive new generalization bounds for multiple source domain adaptation. The results clearly show the improvement over previous bounds for multiple source domain adaptation. We perform extensive experiments on the well-known Amazon reviews benchmark data sets, and the results demonstrate the good performance of our bound for multiple source domain adaptation.
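    As context for the abstract, the sketch below gives one plausible form of a weighted Rademacher complexity for k source domains; the mixture weights \lambda_j, per-domain sample sizes m_j, and Rademacher variables \sigma_{j,i} follow standard multiple-source adaptation notation and are assumptions, not the paper's exact definition.

    $$
    \widehat{\mathfrak{R}}_{\lambda}(H) \;=\; \mathbb{E}_{\sigma}\!\left[\,\sup_{h \in H}\; \sum_{j=1}^{k} \frac{\lambda_j}{m_j} \sum_{i=1}^{m_j} \sigma_{j,i}\, h(x_{j,i})\right],
    \qquad \lambda_j \ge 0,\;\; \sum_{j=1}^{k}\lambda_j = 1,
    $$

    where H is the hypothesis class, x_{j,i} is the i-th sample from source domain j, and the \sigma_{j,i} are i.i.d. uniform \{-1,+1\} variables; setting all \lambda_j = m_j / \sum_l m_l recovers the ordinary empirical Rademacher complexity of the pooled sample.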
  • Publisher
    ieee
  • Conference_Titel
    2015 Conference on Technologies and Applications of Artificial Intelligence (TAAI)
  • Electronic_ISBN
    2376-6824
  • Type
    conf
  • DOI
    10.1109/TAAI.2015.7407124
  • Filename
    7407124