• DocumentCode
    830082
  • Title
    Conventional modeling of the multilayer perceptron using polynomial basis functions
  • Author
    Chen, Mu-Song; Manry, Michael T.

  • Author_Institution
    Dept. of Electr. Eng., Texas Univ., Arlington, TX, USA
  • Volume
    4
  • Issue
    1
  • fYear
    1993
  • fDate
    1/1/1993 12:00:00 AM
  • Firstpage
    164
  • Lastpage
    166
  • Abstract
    A technique for modeling the multilayer perceptron (MLP) neural network, in which input and hidden units are represented by polynomial basis functions (PBFs), is presented. The MLP output is expressed as a linear combination of the PBFs and is therefore a polynomial function of its inputs. Thus, the MLP is isomorphic to conventional polynomial discriminant classifiers or Volterra filters. The modeling technique was successfully applied to several trained MLP networks.
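
    The following is a minimal, hypothetical sketch (not the authors' modeling technique) of the idea the abstract describes: a trained MLP's output can be modeled as a polynomial function of its inputs by fitting polynomial basis functions, here monomials up to total degree 3, with least squares. The network size, the degree, and all names are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in for a trained MLP: 2 inputs, 4 tanh hidden units, 1 linear output.
    W1 = rng.normal(size=(4, 2))
    b1 = rng.normal(size=4)
    w2 = rng.normal(size=4)
    b2 = rng.normal()

    def mlp(x):
        """Forward pass of the small MLP."""
        h = np.tanh(x @ W1.T + b1)
        return h @ w2 + b2

    def poly_features(x, degree=3):
        """Monomial basis functions of the two inputs up to the given total degree."""
        x1, x2 = x[:, 0], x[:, 1]
        cols = [np.ones_like(x1)]
        for d in range(1, degree + 1):
            for i in range(d + 1):
                cols.append(x1 ** (d - i) * x2 ** i)
        return np.stack(cols, axis=1)

    # Sample inputs, evaluate the MLP, and fit polynomial coefficients by least squares.
    X = rng.uniform(-1.0, 1.0, size=(500, 2))
    P = poly_features(X)
    coeffs, *_ = np.linalg.lstsq(P, mlp(X), rcond=None)

    # Compare the fitted polynomial model to the MLP output on fresh inputs.
    X_test = rng.uniform(-1.0, 1.0, size=(100, 2))
    err = np.max(np.abs(poly_features(X_test) @ coeffs - mlp(X_test)))
    print(f"max |MLP - polynomial| on test inputs: {err:.4f}")
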
  • Keywords
    neural nets; polynomials; hidden units; input units; modeling; multilayer perceptron; neural network; polynomial basis functions; Algorithm design and analysis; Filters; Image processing; Multi-layer neural network; Multilayer perceptrons; Neural networks; Nonlinear equations; Pattern recognition; Polynomials; Signal processing
  • fLanguage
    English
  • Journal_Title
    Neural Networks, IEEE Transactions on
  • Publisher
    IEEE
  • ISSN
    1045-9227
  • Type
    jour
  • DOI
    10.1109/72.182712
  • Filename
    182712