• DocumentCode
    1749267
  • Title
    Comparison between support vector algorithm and algebraic perceptron
  • Author
    Hanselmann, Thomas; Noakes, Lyle
  • Author_Institution
    Dept. of Electr. & Electron. Eng., Western Australia Univ., Nedlands, WA, Australia
  • Volume
    2
  • fYear
    2001
  • fDate
    2001
  • Firstpage
    1459
  • Abstract
    Introduces the idea of applying the perceptron learning algorithm in a high-dimensional linear vector space equipped with a scalar product. A linear separation is sought in the high-dimensional space that corresponds to a polynomial separation in the low-dimensional input space. This is similar to polynomial support vector machines (SVMs), but, in contrast to SVMs, the solution found is in general non-optimal. A comparison with SVMs is carried out using binary images as training data.
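    The abstract's core idea (running perceptron updates where all computation in the high-dimensional space happens through scalar products, so a linear separation there yields a polynomial boundary in input space) can be sketched as a dual-form kernel perceptron. This is a minimal illustration only; the polynomial kernel, XOR-style data, and parameter values are assumptions for the sketch, not the paper's actual algebraic-perceptron construction or experimental setup.

    ```python
    import numpy as np

    def poly_kernel(x, y, degree=2):
        # Scalar product in a high-dimensional feature space,
        # computed without forming the feature vectors explicitly.
        return (1.0 + np.dot(x, y)) ** degree

    def kernel_perceptron(X, y, degree=2, epochs=1000):
        # Dual-form perceptron: alpha[i] counts mistakes on example i.
        # A linear separator in feature space corresponds to a
        # polynomial decision boundary in the input space.
        n = len(X)
        alpha = np.zeros(n)
        K = np.array([[poly_kernel(X[i], X[j], degree) for j in range(n)]
                      for i in range(n)])
        for _ in range(epochs):
            mistakes = 0
            for i in range(n):
                f = np.sum(alpha * y * K[:, i])   # decision value via kernel expansion
                if y[i] * f <= 0:                 # misclassified (or on the boundary)
                    alpha[i] += 1
                    mistakes += 1
            if mistakes == 0:                     # converged: data separated in feature space
                break
        return alpha

    def predict(alpha, X_train, y_train, x, degree=2):
        f = sum(a * yi * poly_kernel(xi, x, degree)
                for a, xi, yi in zip(alpha, X_train, y_train))
        return 1 if f > 0 else -1

    # XOR-style labels: not linearly separable in input space,
    # but separable under a degree-2 polynomial kernel.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([-1, 1, 1, -1])
    alpha = kernel_perceptron(X, y)
    preds = [predict(alpha, X, y, x) for x in X]
    ```

    Unlike an SVM, which maximizes the margin, this update rule stops at the first separating solution it finds, matching the abstract's point that the result is in general non-optimal.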
  • Keywords
    generalisation (artificial intelligence); learning automata; perceptrons; algebraic perceptron; binary images; high-dimensional linear vector spaces; linear separation; perceptron learning algorithm; polynomial separation; scalar product; support vector algorithm; Arithmetic; Information processing; Intelligent systems; Kernel; Mathematics; Pattern recognition; Polynomials; Statistics; Support vector machines; Training data;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    IJCNN '01: Proceedings of the International Joint Conference on Neural Networks, 2001
  • Conference_Location
    Washington, DC
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7044-9
  • Type
    conf
  • DOI
    10.1109/IJCNN.2001.939577
  • Filename
    939577