• DocumentCode
    3097348
  • Title
    Learning with nonstatic paradigms in neural networks
  • Author
    Köhle, Monika ; Schönbauer, Franz
  • Author_Institution
    Tech. Univ. of Vienna, Austria
  • fYear
    1989
  • fDate
    10-12 Apr 1989
  • Firstpage
    72
  • Lastpage
    75
  • Abstract
    Schemata, concepts, or any other kind of knowledge contained in a neural network is usually represented in the interconnections of the constituent units, and learning is regarded as modifying the strengths of these interconnections. The authors define higher structured learning as modifying not only the weights but also the overall topology of the net, which implies that knowledge in a neural network is also represented in the architecture of the net. In the static paradigm, the definition of a neural network resembles a variable declaration in a block-oriented language: the overall topology is fixed before learning takes place and is not altered during the learning phase, so only the values of the weights change. In a nonstatic paradigm, units can be created at any time and connected arbitrarily to other units of the net. The authors demonstrate that using a nonstatic approach can improve learning.
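    The static/nonstatic distinction in the abstract can be illustrated with a minimal sketch. The class names (`Unit`, `Network`) and the specific growth step below are illustrative assumptions, not the authors' construction; the point is only that a nonstatic net supports creating and wiring units after the initial topology is declared.

    ```python
    import math

    class Unit:
        """A single sigmoid unit with a mutable list of incoming connections."""
        def __init__(self, uid):
            self.uid = uid
            self.incoming = []      # list of (source Unit, weight) pairs
            self.activation = 0.0

        def compute(self):
            s = sum(w * src.activation for src, w in self.incoming)
            self.activation = 1.0 / (1.0 + math.exp(-s))
            return self.activation

    class Network:
        """Nonstatic network: units may be created and connected at any time,
        not only in an up-front declaration (the static paradigm)."""
        def __init__(self):
            self.units = []

        def add_unit(self):
            u = Unit(len(self.units))
            self.units.append(u)
            return u

        def connect(self, src, dst, weight):
            dst.incoming.append((src, weight))

    # "Static" phase: declare a fixed input -> output topology.
    net = Network()
    x1, x2 = net.add_unit(), net.add_unit()
    out = net.add_unit()
    net.connect(x1, out, 0.5)
    net.connect(x2, out, 0.5)

    # Nonstatic step: mid-learning, a new hidden unit is created and
    # spliced between the inputs and the output, changing the topology.
    h = net.add_unit()
    net.connect(x1, h, 1.0)
    net.connect(x2, h, 1.0)
    net.connect(h, out, -2.0)

    # Forward pass through the grown net.
    x1.activation, x2.activation = 1.0, 1.0
    h.compute()
    out.compute()
    print(round(out.activation, 3))
    ```

    In the static paradigm the `add_unit`/`connect` calls after the initial declaration would be disallowed; here they are ordinary operations, so the architecture itself becomes a learnable representation.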
  • Keywords
    learning systems; neural nets; architecture; block-oriented language; concepts; constituent units; higher structured learning; interconnections; knowledge; neural networks; nonstatic paradigms; schemata; static paradigm; topology; variable declaration; weights; Feature extraction; Feeds; Intelligent networks; Network topology; Neural networks; Pattern recognition; Performance evaluation; Supervised learning; Testing;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    International Workshop on Industrial Applications of Machine Intelligence and Vision, 1989
  • Conference_Location
    Tokyo
  • Type
    conf
  • DOI
    10.1109/MIV.1989.40525
  • Filename
    40525