• DocumentCode
    2943149
  • Title
    Continual neural networks
  • Author
    Galushkin, A.I.
  • Author_Institution
    Sci. Centre of Neurocomput., Acad. of Sci., Moscow, Russia
  • Volume
    1
  • fYear
    1993
  • fDate
    25-29 Oct. 1993
  • Firstpage
    395
  • Abstract
    When constructing open-loop (feedforward) structures of multilayer neural networks for pattern recognition, many parameters describing the structure and the input signal must be introduced in order to achieve the maximum probability of correct recognition in practice. The presence of a very large number of parameters, in the hundreds and thousands, creates difficulties for the learning and the technical implementation of such a multilayer neural network. The essence of introducing continual properties into the characteristics of a multilayer neural network is the following: the vector {x_i, i=1, ..., I} is replaced by a function x(i) of a continuous argument, i.e., a transition is made to a continuum of feature values (see the illustrative sketch at the end of this record). The transition to a feature continuum and to a continuum of neurons in a layer is considered on concrete examples of neural network structures.
  • Keywords
    feedforward neural nets; pattern recognition; probability; vectors; continual neural networks; feature continuum; maximum probability; multilayer neural networks; open-loop structures; vector; Artificial intelligence; Artificial neural networks; Concrete; Image sampling; Multi-layer neural network; Neural networks; Neurons; Pattern recognition; Pulse modulation; Signal generators
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
  • Print_ISBN
    0-7803-1421-2
  • Type
    conf
  • DOI
    10.1109/IJCNN.1993.713940
  • Filename
    713940
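  • Illustrative_Sketch
    A minimal LaTeX sketch of the discrete-to-continuum transition described in the abstract. It assumes a standard weighted-sum neuron with activation f and a feature argument normalized to [0, 1]; the abstract itself states only that the vector {x_i, i=1, ..., I} is replaced by a function x(i) of a continuous argument, so the integral form below is an assumed reading, not an equation quoted from the paper.

    \[
      y = f\!\Big(\sum_{i=1}^{I} w_i\, x_i\Big)
      \quad\longrightarrow\quad
      y = f\!\Big(\int_0^1 w(i)\, x(i)\, \mathrm{d}i\Big)
    \]

    Under the same assumptions, the transition to a continuum of neurons in a layer makes the output index j continuous as well:

    \[
      y(j) = f\!\Big(\int_0^1 w(j, i)\, x(i)\, \mathrm{d}i\Big), \qquad j \in [0, 1].
    \]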