• DocumentCode
    2797462
  • Title
    Optimal Size of a Feedforward Neural Network: How Much does it Matter?
  • Author
    Wang, Lipo ; Quek, Hou Chai ; Tee, Keng Hoe ; Zhou, Nina ; Wan, Chunru
  • Author_Institution
    Coll. of Inf. Eng., Xiangtan Univ.
  • fYear
    2005
  • fDate
    23-28 Oct. 2005
  • Firstpage
    69
  • Lastpage
    69
  • Abstract
    In this paper, we attempt to answer the following question with systematic computer simulations: for the same validation error rate, does the size of a feedforward neural network matter? This is related to the so-called Occam's Razor, that is, with all things being equal, the simplest solution is likely to work the best. Our simulation results indicate that for the same validation error rate, smaller networks do not tend to work better than larger networks, that is, Occam's Razor does not seem to apply to feedforward neural networks. In fact, our results show no trend between network size and performance for a given validation error rate.
  • Keywords
    feedforward neural nets; Occam's Razor; feedforward neural network; validation error rate; Artificial neural networks; Computational modeling; Computer simulation; Educational institutions; Error analysis; Feedforward neural networks; Multilayer perceptrons; Neural networks; Neurons; Training data; Hidden neurons; Learning
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Joint International Conference on Autonomic and Autonomous Systems and International Conference on Networking and Services, 2005 (ICAS-ICNS 2005)
  • Conference_Location
    Papeete, Tahiti
  • Print_ISBN
    0-7695-2450-8
  • Type
    conf
  • DOI
    10.1109/ICAS-ICNS.2005.72
  • Filename
    1559921