• DocumentCode
    396697
  • Title
    Regularization and feedforward artificial neural network training with noise
  • Author
    Chandra, Pravin ; Singh, Yogesh

  • Author_Institution
    Sch. of Inf. Technol., G.G.S. Indraprastha Univ., Delhi, India
  • Volume
    3
  • fYear
    2003
  • fDate
    20-24 July 2003
  • Firstpage
    2366
  • Abstract
    Regularization is a method for controlling the complexity of models. Two commonly used techniques for model complexity control are explicit regularization, in which a modifier term incorporating a priori knowledge about the function to be approximated by a feedforward artificial neural network is added to the risk functional, and implicit regularization, in which noise is added to the system variables during training. The relationship between these two types of regularization is explained. A regularization term is derived based on a general noise model, and the interplay between the various noise-mediated regularization terms is described.
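    The equivalence the abstract refers to can be illustrated in the simplest setting: for a linear model, training on inputs corrupted with zero-mean Gaussian noise is known to approximate explicit L2 (Tikhonov) regularization, with the regularization coefficient governed by the noise variance. The sketch below is illustrative only (the function names and the linear-model setting are assumptions, not the paper's derivation, which treats a general noise model for feedforward networks):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def fit_ridge(X, y, lam):
        # Closed-form minimizer of ||Xw - y||^2 + lam * ||w||^2
        # (explicit regularization: a modifier term added to the risk).
        d = X.shape[1]
        return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

    def fit_with_input_noise(X, y, sigma, n_copies=5000):
        # Ordinary least squares on many noise-corrupted copies of the
        # inputs (implicit regularization: noise added during training).
        Xn = np.vstack([X + sigma * rng.standard_normal(X.shape)
                        for _ in range(n_copies)])
        yn = np.tile(y, n_copies)
        w, *_ = np.linalg.lstsq(Xn, yn, rcond=None)
        return w

    # Synthetic linear data.
    X = rng.standard_normal((50, 3))
    true_w = np.array([1.0, -2.0, 0.5])
    y = X @ true_w + 0.1 * rng.standard_normal(50)

    sigma = 0.5
    w_noise = fit_with_input_noise(X, y, sigma)
    # In expectation, input noise of variance sigma^2 matches an explicit
    # L2 penalty with coefficient N * sigma^2 (N = number of samples).
    w_ridge = fit_ridge(X, y, len(y) * sigma ** 2)
    print(w_noise, w_ridge)
    ```

    The two solutions agree up to sampling error in the injected noise, which shrinks as `n_copies` grows; both are shrunk toward zero relative to the unregularized least-squares fit.
    
    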
  • Keywords
    circuit noise; computational complexity; feedforward neural nets; function approximation; learning (artificial intelligence); feedforward artificial neural network training; general noise model; model complexity control; regularization; risk function; Artificial neural networks; Backpropagation algorithms; Computer errors; Euclidean distance; Information technology; Input variables; Minimization methods; Phase noise; Supervised learning; Training data
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the International Joint Conference on Neural Networks, 2003
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7898-9
  • Type
    conf
  • DOI
    10.1109/IJCNN.2003.1223782
  • Filename
    1223782