• DocumentCode
    329055
  • Title
    Explicit solution of the optimum weights of multilayer perceptron: the binary input case
  • Author
    Yu, Xiao-Hu; Guo, Qiang
  • Author_Institution
    Nat. Commun. Res. Lab., Southeast Univ., Nanjing, China
  • Volume
    2
  • fYear
    1993
  • fDate
    25-29 Oct. 1993
  • Firstpage
    1689
  • Abstract
    Explicit solutions of multilayer feedforward networks have previously been discussed by Yu (1992). This paper extends that idea to the binary input case. We show that the hidden units can be used to form the basis functions of the binary Walsh transform, so that network training can be treated as finding the coefficients of the binary Walsh expansion of the desired mapping, making the optimum weights explicitly solvable. For the case of an incomplete training set, a useful approach is presented to ensure that the resulting network has smooth generalization performance. The noise-rejection performance of the obtained network is also illustrated.
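    As a rough illustration of the idea summarized above (a sketch, not the authors' construction), the following Python snippet computes the coefficients of the binary Walsh expansion of a desired Boolean mapping and reconstructs the mapping from them; in the paper, the hidden units realize the Walsh basis functions and the output-layer weights play the role of these coefficients. The function names and the XOR target below are illustrative assumptions.

    import itertools

    def walsh_coefficients(f, n):
        # Coefficients of the binary Walsh expansion of a mapping f: {0,1}^n -> R.
        # The basis functions are the parity functions w_s(x) = (-1)^(s . x);
        # each coefficient is the normalized correlation of f with one w_s
        # over all 2^n binary inputs.
        inputs = list(itertools.product([0, 1], repeat=n))
        coeffs = {}
        for s in inputs:  # each index vector s labels one basis function
            total = 0.0
            for x in inputs:
                parity = sum(si * xi for si, xi in zip(s, x)) % 2
                total += f(x) * (-1) ** parity
            coeffs[s] = total / 2 ** n
        return coeffs

    def reconstruct(coeffs, x):
        # Evaluate the Walsh expansion at input x; with a complete training set
        # this reproduces f exactly, which is what makes the weights solvable
        # in closed form rather than by iterative training.
        return sum(c * (-1) ** (sum(si * xi for si, xi in zip(s, x)) % 2)
                   for s, c in coeffs.items())

    if __name__ == "__main__":
        xor = lambda x: float(x[0] ^ x[1])  # example target mapping on 2 binary inputs
        c = walsh_coefficients(xor, 2)
        print(c)  # nonzero coefficients only for s = (0, 0) and s = (1, 1)
        print([reconstruct(c, x) for x in itertools.product([0, 1], repeat=2)])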
  • Keywords
    Walsh functions; feedforward neural nets; generalisation (artificial intelligence); learning (artificial intelligence); multilayer perceptrons; transforms; binary Walsh transform; binary input; generalization; hidden units; mapping; multilayer feedforward networks; multilayer perceptron; network learning; noise rejection; optimum weights; Backpropagation algorithms; Computer aided software engineering; Equations; Multilayer perceptrons; Nonhomogeneous media; Radar; Sufficient conditions
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
  • Print_ISBN
    0-7803-1421-2
  • Type
    conf
  • DOI
    10.1109/IJCNN.1993.716978
  • Filename
    716978