DocumentCode :
286727
Title :
An 'introspective' network that can learn to run its own weight change algorithm
Author :
Schmidhuber, J.
Author_Institution :
Tech. Univ. München, Germany
fYear :
1993
fDate :
25-27 May 1993
Firstpage :
191
Lastpage :
194
Abstract :
Usually, weight changes in neural networks are caused exclusively by some hard-wired learning algorithm with many specific limitations. The author shows that it is in principle possible to let the network run and improve its own weight change algorithm (without significant theoretical limits). The author derives an initial gradient-based supervised sequence learning algorithm for an 'introspective' recurrent network that can 'speak' about its own weight matrix in terms of activations. The network uses special subsets of its input and output units for observing its own errors and for explicitly analyzing and manipulating all of its own weights, including those weights responsible for analyzing and manipulating weights. The result is the first 'self-referential' neural network with explicit potential control over all adaptive parameters governing its behavior.
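For intuition, the following is a minimal, hypothetical Python sketch of the self-referential mechanism the abstract describes: special input units feed the network an address, the weight value stored there, and its current error, while special output units let the network address and modify one of its own weights, including weights that compute the modification itself. All names (SelfRefRNN, the sizes, the bounded update rule) are illustrative assumptions, and the paper's actual gradient-based supervised learning of this machinery is omitted here.

```python
# Illustrative sketch only -- not the paper's algorithm. A tiny recurrent
# net whose weights live in one flat, indexable vector so the net can
# "speak" about them: it observes (address, weight, error) through extra
# input units and emits (address, delta) through extra output units.
import numpy as np

rng = np.random.default_rng(0)

class SelfRefRNN:  # hypothetical name, for illustration
    def __init__(self, n_in, n_hidden, n_out):
        # +3 introspective inputs: address, weight value at that address, error
        # +2 manipulation outputs: target address, proposed weight change
        self.shapes = [(n_hidden, n_in + n_hidden + 3),
                       (n_out + 2, n_hidden)]
        sizes = [int(np.prod(s)) for s in self.shapes]
        self.w = rng.normal(0.0, 0.1, sum(sizes))   # flat weight vector
        self.offsets = np.cumsum([0] + sizes)
        self.h = np.zeros(n_hidden)
        self.n_out = n_out

    def _mat(self, i):
        # Reshaped view into the flat vector, so self-modifications persist.
        lo, hi = self.offsets[i], self.offsets[i + 1]
        return self.w[lo:hi].reshape(self.shapes[i])

    def step(self, x, error):
        # Observation: read one of the net's own weights, plus its error.
        addr = int(abs(self.h[0]) * len(self.w)) % len(self.w)
        intro = np.array([addr / len(self.w), self.w[addr], error])
        z = np.concatenate([x, self.h, intro])
        self.h = np.tanh(self._mat(0) @ z)
        out = self._mat(1) @ self.h
        y = out[:self.n_out]
        # Manipulation: the net addresses a weight -- possibly one of the
        # weights computing this very output -- and nudges it (bounded).
        target = int(abs(out[self.n_out]) * len(self.w)) % len(self.w)
        self.w[target] += 0.01 * np.tanh(out[self.n_out + 1])
        return y

net = SelfRefRNN(n_in=2, n_hidden=8, n_out=1)
err = 0.0
for t in range(5):
    x = rng.normal(size=2)
    y = net.step(x, err)
    err = float((y - 1.0) @ (y - 1.0))  # toy target: output 1.0
    print(f"t={t} y={y} err={err:.3f}")
```

The flat weight vector is the design point that makes self-reference possible: every adaptive parameter, including those that compute the manipulation outputs, sits in one addressable array the network can read and write.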
Keywords :
learning (artificial intelligence); recurrent neural nets; adaptive parameters; initial gradient-based supervised sequence learning; neural networks; recurrent network; self-referential nets; weight change algorithm;
fLanguage :
English
Publisher :
IET
Conference_Titel :
Third International Conference on Artificial Neural Networks, 1993
Conference_Location :
Brighton
Print_ISBN :
0-85296-573-7
Type :
conf
Filename :
263229