  • DocumentCode
    3573443
  • Title
    Improved neural network training using redundant structure
  • Author
    Yang, Yingjie; Hinde, Chris; Gillingwater, David
  • Author_Institution
    Center for Comput. Intelligence, De Montfort Univ., Leicester, UK
  • Volume
    3
  • fYear
    2003
  • Firstpage
    2023
  • Abstract
    It is commonly understood in neural network research and applications that a network with fewer redundant nodes is more reliable. This paper argues that a redundant network structure can instead improve the learning process of neural networks. The redundant structure is shown to be free of extra parameters and hence introduces no additional uncertainty. Using a small partition problem, the training results of standard BP networks are compared with those of networks given a redundant structure. The comparison shows that a redundant structure does not always have a negative effect and can help a neural network achieve better performance (an illustrative code sketch follows at the end of this record).
  • Keywords
    learning (artificial intelligence); neural nets; backpropagation; learning process; neural network research; neural network training; redundant network structure; redundant node; standard BP networks; Application software; Buildings; Computational intelligence; Computer science; Insects; Intelligent structures; Joining processes; Neural networks; Performance analysis; Uncertainty;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the International Joint Conference on Neural Networks, 2003
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7898-9
  • Type
    conf
  • DOI
    10.1109/IJCNN.2003.1223718
  • Filename
    1223718
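  • Code_Sketch
    The abstract describes comparing standard BP networks with networks that carry a deliberately redundant hidden layer on a small partition problem. The Python sketch below is only an illustration of that kind of comparison, not the authors' experiment: XOR stands in for the small partition problem, the hidden-layer sizes (2 versus 8 units), learning rate, epoch count, and number of random restarts are all assumed, and plain batch backpropagation with sigmoid units and squared error is used.

```python
# Minimal sketch (assumptions noted above): contrast a minimal BP network
# with a "redundant" one (more hidden units than strictly needed) on XOR.
import numpy as np

rng = np.random.default_rng(0)

# XOR-style partition problem: 4 points, 2 classes.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_bp(n_hidden, epochs=5000, lr=1.0):
    """Plain batch backpropagation for a 2 - n_hidden - 1 sigmoid network."""
    W1 = rng.normal(scale=0.5, size=(2, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        # Forward pass
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # Backward pass (squared-error loss, sigmoid derivatives)
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        # Gradient-descent updates
        W2 -= lr * (h.T @ d_out)
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_h)
        b1 -= lr * d_h.sum(axis=0)
    return np.mean((out - y) ** 2)

# Compare a minimal network with a redundant one over a few random restarts.
for n_hidden in (2, 8):
    errs = [train_bp(n_hidden) for _ in range(10)]
    print(f"hidden={n_hidden}: mean final MSE={np.mean(errs):.4f}, "
          f"failures (MSE > 0.05)={sum(e > 0.05 for e in errs)}")
```

    With these assumed settings, one would generally expect the network with redundant hidden units to reach a low error on more of the random restarts than the minimal network, which is the kind of training comparison the abstract reports.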