• DocumentCode
    312100
  • Title
    A novel multi-type architecture for FANNs
  • Author
    Ho, K.C.; Ponnapalli, P.V.S.; Thomson, M.

  • Author_Institution
    Dept. of Electr. & Electron. Eng., Manchester Metropolitan Univ., UK
  • fYear
    1997
  • fDate
    7-9 Jul 1997
  • Firstpage
    239
  • Lastpage
    244
  • Abstract
    This paper presents a novel multi-type architecture for feedforward artificial neural networks (FANNs) which offers improved speed of convergence, reduced computational complexity and improved generalization ability. The proposed architecture incorporates at least one linear node in the hidden layer. A theoretical analysis compares the rate of change of weights associated with nonlinear and linear hidden node connections. Simulation results demonstrate that the new architecture can significantly improve convergence and reduce the computational time of FANN training while providing better generalization capability. Such an architecture can be extremely useful for on-line training of FANNs.
  • Keywords
    computational complexity; FANN training; FANNs; backpropagation; computational time; convergence; feedforward artificial neural networks; generalization; hidden layer; hidden node connections; multi-type architecture
  • fLanguage
    English
  • Publisher
    IET
  • Conference_Titel
    Artificial Neural Networks, Fifth International Conference on (Conf. Publ. No. 440)
  • Conference_Location
    Cambridge
  • ISSN
    0537-9989
  • Print_ISBN
    0-85296-690-3
  • Type
    conf
  • DOI
    10.1049/cp:19970733
  • Filename
    607524