• DocumentCode
    1829430
  • Title
    Sampling Adaptively Using the Massart Inequality for Scalable Learning
  • Author
    Jianhua Chen ; Jian Xu
  • Author_Institution
    Sch. of Electr. Eng. & Comput. Sci., Louisiana State Univ., Baton Rouge, LA, USA
  • Volume
    2
  • fYear
    2013
  • fDate
    4-7 Dec. 2013
  • Firstpage
    362
  • Lastpage
    367
  • Abstract
    With the advent of the "big data" era, the data mining community faces the increasingly critical problem of developing scalable algorithms capable of mining knowledge from massive amounts of data. This paper develops a sampling-based method to address the issue of scalability. We show how to utilize the new adaptive sampling method in [4] to develop a scalable learning algorithm based on boosting, an ensemble learning method. We present experimental results on benchmark data sets from the UC-Irvine ML data repository that confirm the substantially improved efficiency, and thus scalability, as well as the competitive prediction accuracy of the new adaptive boosting method in comparison with existing approaches.
  • Keywords
    data mining; learning (artificial intelligence); sampling methods; Massart inequality; UC-Irvine ML data repository; adaptive sampling method; benchmark data sets; boosting; competitive prediction accuracy; data mining; ensemble learning method; scalable learning algorithm; Accuracy; Boosting; Computer science; Data mining; Prediction algorithms; Sampling methods; Scalability; Adaptive Sampling; Boosting; Ensemble Learning; Sample Size; Scalable Learning
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Machine Learning and Applications (ICMLA), 2013 12th International Conference on
  • Conference_Location
    Miami, FL
  • Type
    conf
  • DOI
    10.1109/ICMLA.2013.149
  • Filename
    6786136