DocumentCode :
2753334
Title :
A rigorous result about the off-line learning approximation
Author :
Benaim, M. ; Samuelides, M.
fYear :
1991
fDate :
8-14 Jul 1991
Abstract :
Summary form only given, as follows. The authors present a mathematical justification of the offline approximation in continuous-time neural networks. In real-time models, network behavior is governed by two distinct dynamics evolving on different time scales: the weight dynamics, which is the 'slow' dynamics, and the activation dynamics, which is the 'fast' dynamics. The offline approximation consists of assuming that, during the learning process, neural activities are in their steady states. This approximation is a common dogma, often invoked in analyses of network behavior. The authors consider convergent networks and prove that the approximation is valid on a large time scale of order 1/ε, where ε is the learning-rate parameter controlling the learning velocity.
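The two-time-scale structure described in the abstract can be illustrated numerically. The following is a minimal sketch, not the authors' model: a hypothetical scalar network with a fast activation variable and a slow, gradient-like weight update, compared against the offline approximation in which the activation is replaced by its steady state. All names (`simulate`, `s`, `target`) and the specific dynamics are illustrative assumptions.

```python
import math

# Toy scalar two-time-scale system (illustrative only; all names hypothetical):
#   fast activation dynamics:  dx/dt = -x + tanh(w * s)        (fixed input s)
#   slow weight dynamics:      dw/dt = eps * (target - x) * s  (gradient-like rule)
# The offline approximation replaces x by its steady state x*(w) = tanh(w * s).

def simulate(eps=0.01, dt=0.01, s=1.0, target=0.5):
    T = 1.0 / eps                # horizon of order 1/eps, as in the stated result
    n = int(T / dt)
    w = x = 0.0                  # full coupled dynamics
    w_off = 0.0                  # offline (quasi-steady-state) approximation
    for _ in range(n):
        # full system: activation relaxes quickly, weight drifts slowly
        x += dt * (-x + math.tanh(w * s))
        w += dt * eps * (target - x) * s
        # offline system: activation assumed already at its steady state
        x_star = math.tanh(w_off * s)
        w_off += dt * eps * (target - x_star) * s
    return w, w_off

w_full, w_approx = simulate()
print(abs(w_full - w_approx))    # the two weight trajectories stay close
```

Because the activation relaxes on an O(1) time scale while the weight moves on an O(1/ε) scale, the discrepancy between the two weight trajectories remains small over the whole 1/ε horizon, which is the regime the paper's result addresses.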
Keywords :
dynamics; learning systems; neural nets; activation dynamics; continuous-time neural networks; convergent networks; learning process; machine learning; offline approximation; time scales; weight dynamics; Biological control systems; Biological system modeling; Displays; Educational institutions; Neural networks; Neurons; Pattern classification; Pattern recognition; Pulse generation; Velocity control;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IJCNN-91-Seattle: International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
Type :
conf
DOI :
10.1109/IJCNN.1991.155637
Filename :
155637