• DocumentCode
    2067320
  • Title
    Well-balanced learning for reducing the variance of summed squared errors
  • Author
    Kohara, Kazuhiro; Kawaoka, Tsukasa
  • Author_Institution
    NTT Network Inf. Syst. Lab., Tokyo, Japan
  • fYear
    1993
  • fDate
    24-26 Nov 1993
  • Firstpage
    29
  • Lastpage
    33
  • Abstract
    The authors examined how a limited number of training patterns can be used to improve the generalization ability of a backpropagation neural network (BPNN). First, they explain the problem with the conventional learning technique, in which only the mean summed squared error (MSSE) is observed as the stopping criterion for BPNN learning. The proposed well-balanced learning (WBL) technique observes not only the MSSE but also the individual summed squared errors of the training patterns. A BPNN is thereby trained with a smaller deviation among pattern errors than in conventional learning, which improves the network's generalization ability. The effectiveness of WBL is demonstrated through evaluation experiments.
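    A minimal sketch of the stopping check described above, assuming a NumPy-style setup: training stops only when both the mean SSE and the spread of the per-pattern SSEs are small. The function names, thresholds, and the use of a maximum per-pattern SSE bound are illustrative assumptions, not details taken from the paper.

        # Hypothetical WBL-style stopping check (illustrative assumptions, not the paper's exact method).
        import numpy as np

        def per_pattern_sse(targets, outputs):
            # Summed squared error of each training pattern (summed over output units).
            return np.sum((targets - outputs) ** 2, axis=1)

        def should_stop(targets, outputs, msse_threshold, max_pattern_sse):
            # Stop only when the mean SSE is small AND no single pattern's SSE
            # exceeds a per-pattern bound, keeping the errors well balanced.
            sse = per_pattern_sse(targets, outputs)
            return sse.mean() <= msse_threshold and sse.max() <= max_pattern_sse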
  • Keywords
    backpropagation; generalisation (artificial intelligence); learning (artificial intelligence); neural nets; backpropagation neural network; generalization; generalization ability; learning stopping criterion; mean summed squared error; summed squared error variance; summed squared errors; training; training patterns; variance reduction; well-balanced learning; Electronic mail; Error correction; Handwriting recognition; Information systems; Laboratories; Neural networks; Pattern recognition;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the First New Zealand International Two-Stream Conference on Artificial Neural Networks and Expert Systems, 1993
  • Conference_Location
    Dunedin
  • Print_ISBN
    0-8186-4260-2
  • Type
    conf
  • DOI
    10.1109/ANNES.1993.323089
  • Filename
    323089