• DocumentCode
    329050
  • Title
    Noise robustness of EBNN learning
  • Author
    Masuoka, Ryusuke
  • Author_Institution
    Sch. of Comput. Sci., Carnegie Mellon Univ., Pittsburgh, PA, USA
  • Volume
    2
  • fYear
    1993
  • fDate
    25-29 Oct. 1993
  • Firstpage
    1665
  • Abstract
    A variety of methods have recently been proposed for constraining neural networks to satisfy various constraints while being trained. One such approach is to constrain the function approximated by the network to fit desired slopes, or derivatives. Such slopes may be provided by the designer, as in Simard's character recognizer network, which was constrained so that the slope of the output with respect to translations, rotations, etc. of the input should be zero. Alternatively, target slopes may be generated automatically by a program, as in explanation-based neural network (EBNN) learning. While slope information is known to improve generalization, sometimes slope information, as well as value information, is corrupted by noise. This paper explores the effects of noise in value and slope information on EBNN learning, compared with standard backpropagation. Experimental results show several characteristics of the noise robustness of EBNN learning.
  • Keywords
    constraint handling; explanation; information theory; learning (artificial intelligence); neural nets; noise; constraints; explanation based neural network; learning; noise effects; slope information; value information; Backpropagation; Character recognition; Computer science; Electronic mail; Equations; Function approximation; Jacobian matrices; Neural networks; Noise robustness; Pattern recognition;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Neural Networks, 1993. IJCNN '93-Nagoya. Proceedings of 1993 International Joint Conference on
  • Print_ISBN
    0-7803-1421-2
  • Type
    conf
  • DOI
    10.1109/IJCNN.1993.716972
  • Filename
    716972