• Title of article

    A support vector machine formulation to PCA analysis and its kernel version

  • Author/Authors

J. Vandewalle, J.A.K. Suykens, T. Van Gestel, B. De Moor

  • Issue Information
Journal issue with serial numbering, year 2003
  • From page
    447
  • Abstract
In this paper, we present a simple and straightforward primal-dual support vector machine formulation to the problem of principal component analysis (PCA) in dual variables. By considering a mapping to a high-dimensional feature space and application of the kernel trick (Mercer theorem), kernel PCA is obtained as introduced by Schölkopf et al. While least squares support vector machine classifiers have a natural link with kernel Fisher discriminant analysis (minimizing the within-class scatter around targets +1 and -1), for PCA analysis one can take the interpretation of a one-class modeling problem with zero target value around which one maximizes the variance. The score variables are interpreted as error variables within the problem formulation. In this way, primal-dual constrained optimization problem interpretations of linear and kernel PCA analysis are obtained in a similar style as for least squares support vector machine (LS-SVM) classifiers.
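    A minimal sketch of the primal-dual formulation described above, written in assumed LS-SVM-style notation (the symbols gamma, e_k, phi, and Omega_c are illustrative choices, not quoted from the paper):

        % One-class LS-SVM-style primal problem for PCA (sketch; notation assumed).
        % The score variables e_k act as error variables whose variance around
        % the zero target is maximized.
        \[
        \max_{w,\,e}\; J_P(w,e) \;=\; \frac{\gamma}{2}\sum_{k=1}^{N} e_k^{2} \;-\; \frac{1}{2}\, w^{\top} w
        \qquad \text{s.t.}\quad e_k = w^{\top}\bigl(\varphi(x_k) - \hat{\mu}_{\varphi}\bigr),\; k = 1,\dots,N.
        \]
        % Eliminating w via the Lagrangian yields a dual eigenvalue problem on
        % the centered kernel matrix:
        \[
        \Omega_c\,\alpha \;=\; \lambda\,\alpha,
        \qquad (\Omega_c)_{kl} \;=\; \bigl(\varphi(x_k)-\hat{\mu}_{\varphi}\bigr)^{\top}\bigl(\varphi(x_l)-\hat{\mu}_{\varphi}\bigr).
        \]
        % Evaluating the inner products through a Mercer kernel K(x_k, x_l)
        % recovers kernel PCA, while phi(x) = x gives linear PCA.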
  • Journal title
    IEEE TRANSACTIONS ON NEURAL NETWORKS
  • Serial Year
    2003
  • Record number

    62826