• DocumentCode
    488950
  • Title
    Higher-Order CMAC Neural Networks - Theory and Practice
  • Author
    Lane, Stephen H.; Handelman, David A.; Gelfand, Jack J.
  • Author_Institution
    Human Information Processing Group, Department of Psychology, Princeton University, Princeton, NJ 08540; Robicon Systems Inc., 301 N. Harrison St., Suite 242, Princeton, NJ 08540
  • fYear
    1991
  • fDate
    26-28 June 1991
  • Firstpage
    1579
  • Lastpage
    1585
  • Abstract
    CMAC (Cerebellar Model Articulation Controller) neural networks are capable of learning nonlinear functions extremely quickly due to the local nature of the weight updating. The rectangular shape of CMAC receptive field functions, however, produces discontinuous (staircase) function approximations without inherent analytical derivatives. The ability to learn both functions and function derivatives is important for the development of many on-line adaptive filter, estimation, and control algorithms. It is shown that use of B-Spline receptive field functions in conjunction with more general CMAC weight addressing schemes allows higher-order CMAC neural networks to be developed that can learn both functions and function derivatives. This also allows novel hierarchical and multi-layer CMAC network architectures to be constructed that can be trained using standard error back-propagation learning techniques.
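  • Illustrative_Sketch
    The abstract describes replacing the CMAC's rectangular (binary) receptive fields with B-spline receptive fields so that the learned approximation is smooth and has analytic derivatives. The following is a minimal, hypothetical Python/NumPy sketch of that idea, not the authors' implementation: it uses quadratic B-spline receptive fields on overlapping, offset tilings, a local normalised-LMS weight update in place of the classical CMAC error-sharing rule, and returns both the function estimate and its analytic derivative. All names (BSplineCMAC, bspline2, ...) and parameter values are assumptions made for illustration.

    import numpy as np

    def bspline2(t):
        # Quadratic B-spline receptive field centred at t = 0 (support |t| < 1.5).
        a = np.abs(t)
        return np.where(a <= 0.5, 0.75 - t ** 2,
                        np.where(a <= 1.5, 0.5 * (1.5 - a) ** 2, 0.0))

    def bspline2_deriv(t):
        # Analytic derivative of the quadratic B-spline with respect to t.
        a = np.abs(t)
        return np.where(a <= 0.5, -2.0 * t,
                        np.where(a <= 1.5, -np.sign(t) * (1.5 - a), 0.0))

    class BSplineCMAC:
        """1-D CMAC-style approximator with quadratic B-spline receptive
        fields on overlapping, offset tilings of the input range [0, 1]."""

        def __init__(self, n_layers=8, cells_per_layer=20, lr=0.5):
            self.lr = lr
            self.width = 1.0 / cells_per_layer
            # One weight per receptive field, as in a CMAC weight table.
            self.w = np.zeros((n_layers, cells_per_layer))
            # Each layer's grid of field centres is shifted by a fraction of a
            # cell width, mimicking the standard CMAC tiling offsets.
            offsets = (np.arange(n_layers) / n_layers) * self.width
            self.centers = offsets[:, None] + self.width * np.arange(cells_per_layer)[None, :]

        def _activations(self, x):
            # Normalised distance of x from every receptive-field centre.
            t = (x - self.centers) / self.width
            return bspline2(t), bspline2_deriv(t) / self.width

        def predict(self, x):
            act, dact = self._activations(x)
            y = np.sum(self.w * act)        # smooth function estimate
            dy_dx = np.sum(self.w * dact)   # analytic derivative of the estimate
            return y, dy_dx

        def train(self, x, target):
            # Normalised-LMS update: only fields with non-zero activation change,
            # preserving the local (and therefore fast) CMAC learning behaviour.
            act, _ = self._activations(x)
            error = target - np.sum(self.w * act)
            self.w += self.lr * error * act / (np.sum(act ** 2) + 1e-12)
            return error

    # Example: learn sin(2*pi*x) and recover its derivative as a by-product.
    net = BSplineCMAC()
    for x in np.random.rand(5000):
        net.train(x, np.sin(2 * np.pi * x))
    y, dy = net.predict(0.3)   # dy should approximate 2*pi*cos(2*pi*0.3)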
  • Keywords
    Adaptive filters; Biological neural networks; Control systems; Function approximation; Lifting equipment; Neural networks; Polynomials; Shape; Spline; Training data
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    American Control Conference, 1991
  • Conference_Location
    Boston, MA, USA
  • Print_ISBN
    0-87942-565-2
  • Type
    conf
  • Filename
    4791645