• DocumentCode
    2875663
  • Title
    Support vector machines: a tutorial overview and critical appraisal
  • Author
    Niranjan, Mahesan
  • Author_Institution
    Dept. of Comput. Sci., Sheffield Univ., UK
  • fYear
    1999
  • fDate
    1999
  • Firstpage
    42401
  • Abstract
    Summary form only given. There has been much interest in the use of support vector machines (SVMs) as an approach to high-performance pattern classification. In the linearly separable case, an SVM positions the class boundary so that the margin to the nearest example is maximised. This criterion can be implemented by solving a quadratic programming (QP) problem, and the solution turns out to be one in which the class boundary may be expressed as a linear combination of a subset of the training data (the support vectors). The elegance of the QP formulation, and the relationship between the control of complexity in this formulation and the Vapnik-Chervonenkis dimension, are seen as the prime attractions of the SVM method. A related idea in high-performance pattern classification is that of boosting multiple classifiers. The author shows that the standard SVM formulation is not robust to noise, and explains the performance of boosting algorithms by reference to receiver operating characteristic curves.
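    The margin-maximisation criterion described in the abstract can be illustrated numerically. The sketch below is not the paper's method: instead of solving the QP exactly, it approximates the hard-margin solution by subgradient descent on the hinge-loss objective (large C), on assumed synthetic linearly separable data, and then reads off the training points that lie on or inside the margin as the support vectors.

    ```python
    import numpy as np

    # Assumed toy data: class +1 clustered near (2, 2), class -1 near (-2, -2).
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(2.0, 0.3, (20, 2)), rng.normal(-2.0, 0.3, (20, 2))])
    y = np.hstack([np.ones(20), -np.ones(20)])

    # Subgradient descent on the soft-margin SVM objective
    #   0.5 * ||w||^2 + C * sum_i max(0, 1 - y_i (w . x_i + b)),
    # with C large so the result approximates the hard-margin QP solution.
    w = np.zeros(2)
    b = 0.0
    C, lr = 10.0, 0.01
    for _ in range(2000):
        margins = y * (X @ w + b)
        viol = margins < 1                     # points on the wrong side of the margin
        grad_w = w - C * (y[viol, None] * X[viol]).sum(axis=0)
        grad_b = -C * y[viol].sum()
        w -= lr * grad_w
        b -= lr * grad_b

    # Support vectors: training points on or inside the margin.
    support = np.where(y * (X @ w + b) <= 1 + 1e-3)[0]
    acc = np.mean(np.sign(X @ w + b) == y)
    print(f"train accuracy: {acc}, margin/support points: {len(support)}")
    ```

    On well-separated data like this, the learned boundary classifies the training set perfectly and depends only on the handful of points nearest the gap, which is the property the abstract attributes to the QP solution.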
  • Keywords
    pattern classification; Vapnik-Chervonenkis dimensions; balanced kernel perceptron; boosting algorithms; quadratic programming; receiver operating characteristic curves; support vector machines;
  • fLanguage
    English
  • Publisher
    IET
  • Conference_Titel
    Applied Statistical Pattern Recognition (Ref. No. 1999/063), IEE Colloquium on
  • Conference_Location
    Birmingham
  • Type
    conf
  • DOI
    10.1049/ic:19990359
  • Filename
    771381