• DocumentCode
    445899
  • Title
    Yet faster method to optimize SVR hyperparameters based on minimizing cross-validation error
  • Author
    Kobayashi, Kenji; Kitakoshi, Daisuke; Nakano, Ryohei
  • Author_Institution
    Nagoya Inst. of Technol., Japan
  • Volume
    2
  • fYear
    2005
  • fDate
    31 July-4 Aug. 2005
  • Firstpage
    871
  • Abstract
    The performance of support vector (SV) regression depends strongly on its hyperparameters, such as the insensitive-zone thickness, the penalty factor, and the kernel function parameters. A method called MCV-SVR was previously proposed that optimizes the SVR hyperparameters λ so as to minimize a cross-validation error. The method iterates two steps until convergence: step 1 optimizes the parameters θ under a given λ, while step 2 improves λ under the given θ. Recently, a faster version called MCV-SVR-light was proposed, which accelerates step 2 by pruning. The present paper further accelerates step 1 of MCV-SVR-light by pruning, without affecting solution quality; here, pruning means confining the computation to the support vectors. Our experiments using three data sets show that the proposed method converges faster than the existing methods while its generalization performance remains comparable. (A minimal illustrative sketch of the two-step loop follows this record.)
  • Keywords
    regression analysis; support vector machines; MCV-SVR-light; cross-validation error; insensitive zone thickness; support vector regression; Acceleration; Convergence; Kernel; Lagrangian functions; Neural networks; Optimization methods; Quadratic programming; Training data; Vectors
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05)
  • Print_ISBN
    0-7803-9048-2
  • Type
    conf
  • DOI
    10.1109/IJCNN.2005.1555967
  • Filename
    1555967
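
A minimal sketch of the two-step loop described in the abstract. This is an illustration, not the paper's implementation: scikit-learn's SVR stands in for the step-1 solver, and the paper's gradient-based update of λ in step 2 is replaced by a simple halving/doubling coordinate search over C (penalty factor), epsilon (insensitive-zone thickness), and gamma (RBF kernel parameter). The synthetic data and the search factors are assumptions for demonstration only.

# Illustrative sketch only: scikit-learn's SVR stands in for the paper's
# step-1 solver; the coordinate search below replaces the paper's
# gradient-based lambda update (an assumption, not the paper's method).
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

# Synthetic data, for demonstration only.
X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=0)

# lambda = (C, epsilon, gamma): penalty factor, insensitive-zone thickness,
# and RBF kernel parameter, i.e. the hyperparameters named in the abstract.
lam = {"C": 1.0, "epsilon": 0.1, "gamma": 0.1}

def cv_error(lam):
    # Step 1: fitting SVR under a given lambda optimizes the parameters
    # theta (the dual coefficients); cross_val_score refits per fold and
    # returns negated MSE, so the sign is flipped back here.
    model = SVR(kernel="rbf", **lam)
    return -cross_val_score(model, X, y, cv=5,
                            scoring="neg_mean_squared_error").mean()

best = cv_error(lam)
for _ in range(10):  # iterate the two steps until (approximate) convergence
    improved = False
    # Step 2: improve lambda, refitting theta under each candidate.
    for key in lam:
        for factor in (0.5, 2.0):
            trial = dict(lam, **{key: lam[key] * factor})
            err = cv_error(trial)
            if err < best:
                lam, best, improved = trial, err, True
    if not improved:
        break

print("selected hyperparameters:", lam, "CV MSE:", best)

A gradient-based update of λ, as in the paper, would replace the inner coordinate search, and the paper's pruning would confine each refit to the current support vectors; the alternating two-step structure is the same.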