• DocumentCode
    2234015
  • Title
    Sparse approximation using least squares support vector machines
  • Author
    Suykens, J.A.K.; Lukas, L.; Vandewalle, J.
  • Author_Institution
    ESAT, Katholieke Univ. Leuven, Heverlee, Belgium
  • Volume
    2
  • fYear
    2000
  • fDate
    2000
  • Firstpage
    757
  • Abstract
    In least squares support vector machines (LS-SVMs) for function estimation, Vapnik's ε-insensitive loss function has been replaced by a cost function which corresponds to a form of ridge regression. In this way nonlinear function estimation is done by solving a linear set of equations instead of solving a quadratic programming problem. The LS-SVM formulation also involves fewer tuning parameters. However, a drawback is that sparseness is lost in the LS-SVM case. In this paper we investigate imposing sparseness by pruning support values from the sorted support value spectrum which results from the solution to the linear system. (An illustrative sketch of this linear system and the pruning step follows the record below.)
  • Keywords
    least squares approximations; nonlinear functions; radial basis function networks; sparse matrices; statistical analysis; cost function; least squares support vector machines; linear equations set; nonlinear function estimation; ridge regression; sorted support value spectrum; sparse approximation; sparseness; tuning parameters; Cost function; Equations; Kernel; Least squares approximation; Least squares methods; Linear systems; Quadratic programming; Support vector machine classification; Support vector machines
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the 2000 IEEE International Symposium on Circuits and Systems (ISCAS 2000), Geneva
  • Conference_Location
    Geneva
  • Print_ISBN
    0-7803-5482-6
  • Type
    conf
  • DOI
    10.1109/ISCAS.2000.856439
  • Filename
    856439
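
The following is a minimal, hypothetical Python sketch (not the authors' code) of the mechanics the abstract describes: LS-SVM regression obtained by solving the linear KKT system [0, 1^T; 1, Omega + I/gamma][b; alpha] = [0; y] with Omega[i, j] = K(x_i, x_j), followed by a simple pruning pass that drops the smallest |alpha_k| from the sorted support value spectrum and re-solves on the retained points. The function names (rbf_kernel, lssvm_fit, prune_and_refit), the RBF kernel choice, and the fixed keep_fraction are illustrative assumptions; the paper's actual pruning schedule may differ.

    # Illustrative sketch only -- assumptions: RBF kernel, one-shot pruning by keep_fraction.
    import numpy as np

    def rbf_kernel(X1, X2, sigma=1.0):
        """Gaussian RBF kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / (2*sigma^2))."""
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
        """Solve the LS-SVM linear system
            [ 0      1^T            ] [ b     ]   [ 0 ]
            [ 1   Omega + I/gamma   ] [ alpha ] = [ y ]
        instead of a QP; alpha are the support values."""
        n = X.shape[0]
        Omega = rbf_kernel(X, X, sigma)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = Omega + np.eye(n) / gamma
        rhs = np.concatenate(([0.0], y))
        sol = np.linalg.solve(A, rhs)
        return sol[1:], sol[0]          # alpha, b

    def lssvm_predict(X_train, alpha, b, X_test, sigma=1.0):
        """Model output y(x) = sum_k alpha_k K(x, x_k) + b."""
        return rbf_kernel(X_test, X_train, sigma) @ alpha + b

    def prune_and_refit(X, y, keep_fraction=0.5, gamma=10.0, sigma=1.0):
        """Impose sparseness: fit once, sort the support value spectrum |alpha|,
        keep only the largest values, and re-solve on the retained points."""
        alpha, _ = lssvm_fit(X, y, gamma, sigma)
        order = np.argsort(np.abs(alpha))[::-1]       # sorted support value spectrum
        keep = order[: max(2, int(keep_fraction * len(alpha)))]
        alpha_s, b_s = lssvm_fit(X[keep], y[keep], gamma, sigma)
        return keep, alpha_s, b_s

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = np.linspace(-3, 3, 100)[:, None]
        y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(100)
        keep, alpha, b = prune_and_refit(X, y, keep_fraction=0.3)
        y_hat = lssvm_predict(X[keep], alpha, b, X, sigma=1.0)
        print("retained support vectors:", len(keep),
              "RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))

In this sketch the trade-off is explicit: a smaller keep_fraction yields a sparser model (fewer kernel evaluations at prediction time) at the cost of some approximation accuracy, which is the effect the paper investigates.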