• DocumentCode
    2709445
  • Title

    A kernel hat matrix based rejection criterion for outlier removal in support vector regression

  • Author

    Dufrenois, Franck ; Noyer, Jean Charles

  • Author_Institution
Lab. d'Analyse des Syst. du Littoral, Univ. of Calais, Calais, France
  • fYear
    2009
  • fDate
    14-19 June 2009
  • Firstpage
    736
  • Lastpage
    743
  • Abstract
In this paper, we propose a kernel hat matrix based learning stage for outlier removal. In particular, we show that the Gaussian kernel hat matrix has very interesting discriminative properties when appropriate values are chosen for the kernel parameters. We therefore develop a practical model selection criterion to cleanly separate the “outlier” distribution from the “dominant” distribution. This learning stage, applied beforehand to the training data set, offers a new way to down-weight outliers corrupting both the response and predictor variables in regression tasks. Applications to simulated and real data show the robustness of the proposed approach.
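    The sketch below is only an illustration of the kind of pre-processing the abstract describes: it forms the hat (smoother) matrix of Gaussian kernel ridge regression, H = K(K + λnI)^{-1}, and thresholds its diagonal leverage values to flag suspect samples before fitting an SVR. The rejection rule, the quantile threshold, and the parameter choices (`gamma`, `lam`, `keep_quantile`) are assumptions for demonstration, not the paper's model selection criterion.

    ```python
    # Illustrative sketch, not the authors' exact criterion: a kernel-ridge-style
    # Gaussian kernel hat matrix whose diagonal (leverage) values are thresholded
    # to reject suspect samples before training an SVR.
    import numpy as np
    from sklearn.metrics.pairwise import rbf_kernel
    from sklearn.svm import SVR

    def gaussian_hat_matrix(X, gamma=1.0, lam=1e-2):
        """H = K (K + lam * n * I)^{-1}, the hat matrix of Gaussian kernel ridge."""
        K = rbf_kernel(X, X, gamma=gamma)
        n = K.shape[0]
        return K @ np.linalg.solve(K + lam * n * np.eye(n), np.eye(n))

    def reject_then_fit(X, y, gamma=1.0, lam=1e-2, keep_quantile=0.9):
        """Hypothetical rejection rule: drop the highest-leverage samples,
        then fit an SVR on the remaining 'dominant' data."""
        H = gaussian_hat_matrix(X, gamma=gamma, lam=lam)
        leverage = np.diag(H)
        keep = leverage <= np.quantile(leverage, keep_quantile)
        model = SVR(kernel="rbf", gamma=gamma).fit(X[keep], y[keep])
        return model, keep

    # Usage on synthetic 1-D data with a few gross outliers in the response.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sinc(X).ravel() + 0.05 * rng.normal(size=200)
    y[:10] += 3.0                      # corrupt a few responses
    model, kept = reject_then_fit(X, y, gamma=0.5)
    print(f"kept {kept.sum()} of {len(y)} samples")
    ```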
  • Keywords
    Gaussian processes; data mining; learning (artificial intelligence); regression analysis; support vector machines; Gaussian kernel hat matrix; discriminative property; dominant distribution; down-weighting outliers; kernel parameter; learning stage; model selection criteria; outlier distribution; outlier removal; rejection criterion; support vector regression; Covariance matrix; Kernel; Least squares methods; Linear regression; Neural networks; Predictive models; Regression analysis; Robustness; Training data; Vectors;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
International Joint Conference on Neural Networks (IJCNN 2009)
  • Conference_Location
    Atlanta, GA
  • ISSN
    1098-7576
  • Print_ISBN
    978-1-4244-3548-7
  • Electronic_ISBN
    1098-7576
  • Type

    conf

  • DOI
    10.1109/IJCNN.2009.5178778
  • Filename
    5178778