• DocumentCode
    288380
  • Title
    A training approach based on linear separability analysis for layered perceptrons
  • Author
    Zhang, D.; Kamel, M.; Elmasry, M.I.
  • Author_Institution
    VLSI Res. Group, Waterloo Univ., Ont., Canada
  • Volume
    1
  • fYear
    1994
  • fDate
    27 Jun-2 Jul 1994
  • Firstpage
    517
  • Abstract
    In this paper, we explore linear separability analysis as the basis of a training approach for layered perceptrons. The approach, called layer adaptation (LA), is presented; its learning mechanism and implementation are described, and examples are given to illustrate its effectiveness. Preliminary analysis shows that, compared with the BP and MRII algorithms, LA is easy to implement in digital VLSI technology while its stability, training time, and silicon complexity remain acceptable.
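    The notion of a linearly separable binary function (see Keywords) can be illustrated with a small sketch. The snippet below is not the paper's LA algorithm; it is an assumed, self-contained Python example that uses the classical perceptron rule to test whether a binary truth table is linearly separable. The rule provably converges only for separable functions, so bounded non-convergence is treated here as a heuristic "not separable", not a proof.

        import numpy as np

        def is_linearly_separable(truth_table, epochs=1000, lr=1.0):
            # truth_table: dict mapping 0/1 input tuples to a 0/1 target.
            # Train a single threshold unit with the perceptron rule; if it
            # reaches zero errors within the epoch budget, the function is
            # linearly separable and (w, b) realizes it.
            n = len(next(iter(truth_table)))
            w = np.zeros(n)
            b = 0.0
            for _ in range(epochs):
                errors = 0
                for x, t in truth_table.items():
                    x = np.asarray(x, dtype=float)
                    y = 1 if np.dot(w, x) + b > 0 else 0
                    if y != t:
                        w += lr * (t - y) * x   # perceptron weight update
                        b += lr * (t - y)       # perceptron bias update
                        errors += 1
                if errors == 0:
                    return True, w, b
            return False, w, b

        # AND is linearly separable; XOR is the classic non-separable case.
        AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
        XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
        print(is_linearly_separable(AND)[0])  # True
        print(is_linearly_separable(XOR)[0])  # False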
  • Keywords
    learning (artificial intelligence); multilayer perceptrons; neural nets; digital layered perceptrons; layer adaptation; learning mechanism; linear separability analysis; linear separable binary function; training time; Algorithm design and analysis; Artificial neural networks; Design engineering; Learning systems; Logic functions; Neurons; Stability analysis; System analysis and design; Systems engineering and theory; Very large scale integration;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
  • Conference_Location
    Orlando, FL
  • Print_ISBN
    0-7803-1901-X
  • Type
    conf
  • DOI
    10.1109/ICNN.1994.374217
  • Filename
    374217