• DocumentCode
    303202
  • Title
    Sparse initial topologies for high order perceptrons
  • Author
    De Pol, A.; Thimm, G.; Fiesler, E.
  • Author_Institution
    IDIAP, Martigny, Switzerland
  • Volume
    1
  • fYear
    1996
  • fDate
    3-6 Jun 1996
  • Firstpage
    84
  • Abstract
    High order neural networks are more powerful than first order neural networks. Their main drawback is the large number of connections they entail in their default, fully interlayer connected form. The methods presented here produce initial topologies that use only a very small fraction of the possible connections, yet provide a good framework for successful training of high order perceptrons. These initial topologies can be refined using ontogenic techniques that modify the topology during the training of the network. The methods are based on approximating real valued data by Boolean data, which is used as a basis for constructing the network structure. The methods are evaluated for their effectiveness in reducing the network size by comparing them to fully interlayer connected high order perceptrons, and their performance is evaluated by testing the generalization capabilities of the resulting networks.
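    A minimal sketch of the kind of architecture the abstract describes: a second order perceptron whose initial topology keeps only a few of the possible product connections. This is an assumption-laden illustration in Python, not the paper's Boolean-approximation construction; the helper make_features, the chosen index pairs, and the toy task are hypothetical.

    import numpy as np

    # Sketch of a sparse second order perceptron (illustration only;
    # the pair selection below is a hypothetical choice, not the
    # Boolean-approximation method of the paper).
    def make_features(x, pairs):
        # Augment the raw inputs with the chosen product (high order) terms.
        return np.concatenate([x, [x[i] * x[j] for i, j in pairs]])

    rng = np.random.default_rng(0)
    n_inputs = 8

    # Full second order connectivity needs n*(n-1)/2 = 28 product terms;
    # this sparse initial topology keeps only 3 of them.
    sparse_pairs = [(0, 1), (2, 5), (3, 7)]

    w = rng.normal(scale=0.1, size=n_inputs + len(sparse_pairs))
    b, lr = 0.0, 0.1

    # Toy task: the label is the sign of x0*x1, which a first order
    # perceptron cannot represent but the (0, 1) product term can.
    X = rng.uniform(-1.0, 1.0, size=(200, n_inputs))
    y = np.sign(X[:, 0] * X[:, 1])

    for _ in range(50):                      # classic perceptron training loop
        for x, target in zip(X, y):
            phi = make_features(x, sparse_pairs)
            if np.sign(w @ phi + b) != target:
                w += lr * target * phi       # mistake-driven weight update
                b += lr * target

    acc = np.mean([np.sign(w @ make_features(x, sparse_pairs) + b) == t
                   for x, t in zip(X, y)])
    print(f"training accuracy: {acc:.2f}")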
  • Keywords
    Boolean functions; approximation theory; generalisation (artificial intelligence); network topology; perceptrons; Boolean logic; connectionism; data approximation; generalization; high order perceptrons; ontogenic neural networks; sparse initial topology; Combinatorial mathematics; Equations; Explosions; Network topology; Neural networks; Neurons; Polynomials; Splicing; Testing
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    IEEE International Conference on Neural Networks, 1996
  • Conference_Location
    Washington, DC
  • Print_ISBN
    0-7803-3210-5
  • Type
    conf
  • DOI
    10.1109/ICNN.1996.548871
  • Filename
    548871