• DocumentCode
    445904
  • Title
    Optimizing resources in model selection for support vector machines
  • Author
    Adankon, Mathias M.; Cheriet, M.; Ayat, Nedjem E.
  • Author_Institution
    Laboratory for Imagery, Vision & Artificial Intelligence, École de Technol. Supérieure, Montreal, Que., Canada
  • Volume
    2
  • fYear
    2005
  • fDate
    31 July-4 Aug. 2005
  • Firstpage
    925
  • Abstract
    Tuning SVM kernel parameters is an important step in building a high-performing learning machine. The usual automatic methods for tuning these parameters require inverting the Gram matrix or solving an additional quadratic programming problem. On large datasets, these methods add substantial memory and CPU time to the already significant resources consumed by SVM training. In this paper, we propose a fast method based on an approximation of the gradient of the empirical error, combined with incremental learning, which reduces the resources required in terms of both processing time and storage space.
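    The Gram-matrix-inversion cost that motivates the paper can be illustrated with a toy sketch (not the paper's algorithm): tuning the RBF kernel width gamma by validation error for a kernel regularized least-squares classifier, where each candidate gamma requires an O(n^3) solve against the Gram matrix. All names and data below are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy two-class data: two Gaussian blobs in 2-D.
    X = np.vstack([rng.normal(-1.0, 1.0, (40, 2)),
                   rng.normal(1.0, 1.0, (40, 2))])
    y = np.hstack([-np.ones(40), np.ones(40)])
    idx = rng.permutation(80)
    X_tr, y_tr = X[idx[:60]], y[idx[:60]]
    X_va, y_va = X[idx[60:]], y[idx[60:]]

    def rbf(A, B, gamma):
        """RBF Gram matrix: K[i, j] = exp(-gamma * ||A_i - B_j||^2)."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def val_error(gamma, lam=1e-2):
        K = rbf(X_tr, X_tr, gamma)
        # O(n^3) solve against the regularized Gram matrix -- the kind of
        # per-candidate cost the abstract says dominates on large datasets.
        alpha = np.linalg.solve(K + lam * np.eye(len(X_tr)), y_tr)
        pred = np.sign(rbf(X_va, X_tr, gamma) @ alpha)
        return float(np.mean(pred != y_va))

    # Naive model selection: repeat the expensive solve for every candidate.
    gammas = [0.01, 0.1, 1.0, 10.0]
    errors = {g: val_error(g) for g in gammas}
    best = min(errors, key=errors.get)
    print("validation errors:", errors)
    print("selected gamma:", best)
    ```

    Each candidate repeats the full solve; a gradient-approximation scheme such as the one proposed here aims to avoid exactly this repeated cost.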
  • Keywords
    learning (artificial intelligence); quadratic programming; support vector machines; very large databases; high-performing learning machine; incremental learning; large dataset; quadratic programming; support vector machines; Artificial intelligence; Error analysis; Kernel; Laboratories; Machine learning; Quadratic programming; Risk management; Support vector machine classification; Support vector machines; Training data;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05)
  • Print_ISBN
    0-7803-9048-2
  • Type
    conf
  • DOI
    10.1109/IJCNN.2005.1555976
  • Filename
    1555976