• DocumentCode
    1748797
  • Title
    Neural network training for varying output node dimension
  • Author
    Jung, Jae-Byung; El-Sharkawi, M.A.; Marks, R.J., II; Miyamoto, Robert; Fox, Warren L.J.; Anderson, G.M.; Eggen, C.J.
  • Author_Institution
    Dept. of Electr. Eng., Washington Univ., Seattle, WA, USA
  • Volume
    3
  • fYear
    2001
  • fDate
    2001
  • Firstpage
    1733
  • Abstract
    Considers the problem of neural network supervised learning when the number of output nodes varies across training data. The paper proposes irregular weight updates and learning-rate adjustment to compensate for this variation. To guard against possible overtraining, an a posteriori probability of how often the weights associated with each output neuron are updated is estimated from the training data set and used to distribute weight-update opportunities evenly across the output neurons. The resulting weight space is smoother, and generalization performance is significantly improved.
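    The balancing idea in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' exact formulation: it assumes a single linear layer trained by SGD, where each sample defines targets for only a subset of output nodes (a mask), the per-node update frequency is estimated from the training set, and each output's learning rate is scaled inversely to that frequency so rarely-present output neurons receive a comparable share of updates.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: 4 inputs, up to 3 output nodes; each sample defines targets
    # for only a subset of outputs (mask marks which nodes are present).
    X = rng.normal(size=(100, 4))
    Y = rng.normal(size=(100, 3))
    mask = rng.random((100, 3)) < np.array([0.9, 0.5, 0.2])  # node 2 is rare

    # A-posteriori update frequency per output node, estimated from the
    # training set; rarely-present nodes get a larger learning-rate scale
    # so every output neuron gets a comparable opportunity for updates.
    p = mask.mean(axis=0)                      # how often each node is updated
    lr_scale = p.mean() / np.maximum(p, 1e-8)  # inverse-frequency scaling

    W = np.zeros((4, 3))
    base_lr = 0.02
    for epoch in range(50):
        for x, y, m in zip(X, Y, mask):
            err = (y - x @ W) * m              # only present outputs contribute
            # irregular update: each output column has its own effective rate
            W += base_lr * np.outer(x, err * lr_scale)
    ```

    Without the `lr_scale` factor, the weights feeding rare output nodes would be updated far less often than the rest, which is the uneven-training effect the paper's scheme is designed to counteract.
    
    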
  • Keywords
    generalisation (artificial intelligence); learning (artificial intelligence); multilayer perceptrons; probability; a posteriori probability; generalization performance; irregular weight updates; learning rate adjustment; neural network supervised learning; neural network training; varying output node dimension; weight space; Computational intelligence; Filling; Laboratories; Multilayer perceptrons; Network topology; Neural networks; Neurons; Physics; Supervised learning; Training data
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Neural Networks, 2001. Proceedings. IJCNN '01. International Joint Conference on
  • Conference_Location
    Washington, DC
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7044-9
  • Type
    conf
  • DOI
    10.1109/IJCNN.2001.938423
  • Filename
    938423