  • DocumentCode
    3242020
  • Title
    Generalising the nodes of the error propagation network
  • Author
    Robinson, A.J.; Niranjan, Mahesan; Fallside, F.

  • Author_Institution
    Dept. of Eng., Cambridge Univ., UK
  • fYear
    1989
  • fDate
    0-0 1989
  • Abstract
    Summary form only given, as follows. Gradient descent has been used with much success to train connectionist models in the form of the error propagation networks of Rumelhart, Hinton, and Williams. In these nets the output of a node is a nonlinear function of the weighted sum of the activations of other nodes. This type of node defines a hyperplane in the input space, but other types of nodes are possible. For example, the Kanerva model, the modified Kanerva model, networks of spherical graded units, networks of localized receptive fields, and the method of radial basis functions all use nodes which define volumes in the input space. It is shown that the error propagation algorithm can be used to train general types of nodes. The example of a Gaussian node is given, and this is compared with other connectionist models for the problem of recognition of steady-state vowels from multiple speakers.
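  • Note
    The abstract's central idea, that gradient descent applies to nodes defining volumes rather than hyperplanes, can be illustrated with a minimal sketch. This is not the authors' code; the single Gaussian node, learning rate, and training data below are illustrative assumptions. The node computes y = exp(-||x - c||^2 / (2 s^2)), and both the centre c and width s are updated by gradient descent on squared error, just as weights would be in a standard error propagation network.

    ```python
    import math

    def gaussian_node(x, c, s):
        # Output of one Gaussian node: a "volume" (bump) in input space,
        # in contrast to the hyperplane defined by a weighted-sum node.
        d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
        return math.exp(-d2 / (2 * s * s))

    def train(samples, c, s, lr=0.2, epochs=300):
        # Gradient descent on E = 0.5 * (y - t)^2 w.r.t. the node's
        # parameters: dE/dc_i = (y - t) * y * (c_i - x_i) / s^2,
        # dE/ds = -(y - t) * y * d2 / s^3.
        for _ in range(epochs):
            for x, t in samples:
                y = gaussian_node(x, c, s)
                err = y - t
                d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
                for i in range(len(c)):
                    c[i] -= lr * err * y * (c[i] - x[i]) / (s * s)
                s -= lr * -err * y * d2 / (s ** 3)
        return c, s

    # Toy task: the node should fire near [0, 0] and not near [2, 2].
    samples = [([0.0, 0.0], 1.0), ([2.0, 2.0], 0.0)]
    c, s = train(samples, c=[0.5, 0.5], s=1.0)
    ```

    The same chain-rule machinery extends to layers of such nodes, which is the generalisation the paper argues for.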
  • Keywords
    neural nets; optimisation; Gaussian node; Kanerva model; connectionist models; error propagation network; localized receptive fields; multiple speakers; nodes generalisation; nonlinear function; radial basis functions; speech recognition; spherical graded units; steady-state vowels; vowel recognition; Neural networks; Optimization methods;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    International Joint Conference on Neural Networks (IJCNN), 1989
  • Conference_Location
    Washington, DC, USA
  • Type
    conf
  • DOI
    10.1109/IJCNN.1989.118343
  • Filename
    118343