• DocumentCode
    113879
  • Title
    On the robustness and generalization of Cauchy regression
  • Author
    Tongliang Liu; Dacheng Tao
  • Author_Institution
    Faculty of Engineering & Information Technology, University of Technology, Sydney, NSW, Australia
  • fYear
    2014
  • fDate
    26-28 April 2014
  • Firstpage
    100
  • Lastpage
    105
  • Abstract
    A recent special issue of Nature [1] highlighted that the value of big data has yet to be effectively exploited for innovation, competition and productivity. To realize the full potential of big data, big learning algorithms must be developed that keep pace with the continuous creation, storage and sharing of data. Least squares (LS) and least absolute deviation (LAD) have been successful regression tools in business, government and society over the past few decades. However, both are severely limited by noisy data because their breakdown points are zero, i.e., they do not tolerate outliers. By appropriately setting the tuning constant of Cauchy regression (CR), the maximum possible breakdown point (50%) can be attained, so CR is able to learn a robust model from noisy big data. Although the breakdown point of CR has been comprehensively analyzed in theory, we propose a new approach that interprets the optimization of the objective function as a sample-weighted procedure, which clearly exposes the differences in robustness among LS, LAD and CR. We also study the statistical performance of CR: we derive generalization error bounds for CR by analyzing the covering number and Rademacher complexity of its hypothesis class, and we show how the scale parameter affects its performance.
  • Keywords
    computational complexity; generalisation (artificial intelligence); learning (artificial intelligence); regression analysis; sampling methods; Big Data; CR; Cauchy regression; LAD; LS; Rademacher complexity; big learning algorithms; covering number; data creation; data sharing; data storage; generalization error bounds; hypothesis class; least absolute deviation; least squares; objective function; sample-weighted procedure; scale parameter; statistical performance; Big data; Complexity theory; Electric breakdown; Optimization; Robustness; Training; Upper bound; Cauchy regression; Rademacher complexity; covering number; generalization error bound; robustness
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    2014 4th IEEE International Conference on Information Science and Technology (ICIST)
  • Conference_Location
    Shenzhen, China
  • Type
    conf
  • DOI
    10.1109/ICIST.2014.6920341
  • Filename
    6920341