• DocumentCode
    1946341
  • Title
    Upper Bound on Pattern Storage in Feedforward Networks
  • Author
    Narasimha, Pramod L.; Manry, Michael T.; Maldonado, Francisco
  • Author_Institution
    Univ. of Texas, Arlington
  • fYear
    2007
  • fDate
    12-17 Aug. 2007
  • Firstpage
    1714
  • Lastpage
    1719
  • Abstract
    Starting from the strict interpolation equations for multivariate polynomials, an upper bound is developed on the number of patterns that can be memorized by a nonlinear feedforward network. A straightforward proof by contradiction is presented for the upper bound, and it is shown that the hidden activations do not have to be analytic. Networks trained by conjugate gradient are used to demonstrate the tightness of the bound for random patterns. Based upon the upper bound, small multilayer perceptron models are successfully demonstrated as compact approximations of large support vector machines.
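    Illustrative note (not from the paper): the memorization experiment summarized above can be sketched in a few lines of Python, shown below. The sketch trains a one-hidden-layer MLP on random patterns with a conjugate-gradient optimizer and reports the final fitting error. The parameter-counting estimate Nv = Nw / M used to size the pattern set is a generic heuristic stand-in, not necessarily the exact bound derived in the paper, and all variable names are hypothetical.

      # Hypothetical sketch: memorize random patterns with a one-hidden-layer MLP
      # trained by conjugate gradient (scipy's "CG" optimizer). The pattern count
      # Nv = Nw // M is a heuristic parameter-counting estimate, not the paper's bound.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)
      N, Nh, M = 4, 8, 1                     # inputs, hidden units, outputs (arbitrary)
      Nw = Nh * (N + 1) + M * (Nh + 1)       # number of free weights in the MLP
      Nv = Nw // M                           # heuristic estimate of memorizable patterns

      X = rng.standard_normal((Nv, N))       # random input patterns
      T = rng.standard_normal((Nv, M))       # random target outputs

      def unpack(w):
          W1 = w[:Nh * (N + 1)].reshape(Nh, N + 1)   # input-to-hidden weights (+ bias)
          W2 = w[Nh * (N + 1):].reshape(M, Nh + 1)   # hidden-to-output weights (+ bias)
          return W1, W2

      def mse(w):
          W1, W2 = unpack(w)
          Xa = np.hstack([X, np.ones((Nv, 1))])      # augment inputs with bias column
          H = np.tanh(Xa @ W1.T)                     # hidden-layer activations
          Ha = np.hstack([H, np.ones((Nv, 1))])
          Y = Ha @ W2.T                              # network outputs
          return np.mean((Y - T) ** 2)

      w0 = 0.1 * rng.standard_normal(Nw)
      res = minimize(mse, w0, method="CG", options={"maxiter": 5000})
      print(f"patterns = {Nv}, weights = {Nw}, final MSE = {res.fun:.3e}")

    Sweeping the pattern count past this estimate and watching the attainable training error climb away from zero is one simple way to probe the kind of tightness result the abstract reports.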
  • Keywords
    conjugate gradient methods; interpolation; multilayer perceptrons; support vector machines; conjugate gradient; hidden activations; interpolation equations; multilayer perceptron models; multivariate polynomials; nonlinear feedforward network; pattern storage; support vector machines; upper bound; Feedforward neural networks; Interpolation; Multilayer perceptrons; Neural networks; Nonlinear equations; Polynomials; Shape; Support vector machines; Upper bound;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    2007 International Joint Conference on Neural Networks (IJCNN 2007)
  • Conference_Location
    Orlando, FL
  • ISSN
    1098-7576
  • Print_ISBN
    978-1-4244-1379-9
  • Electronic_ISBN
    1098-7576
  • Type
    conf
  • DOI
    10.1109/IJCNN.2007.4371216
  • Filename
    4371216