Title :
An 'introspective' network that can learn to run its own weight change algorithm
Author_Institution :
Tech. Univ. München, Germany
Abstract :
Usually, weight changes in neural networks are caused exclusively by some hard-wired learning algorithm with many specific limitations. The author shows that it is in principle possible to let the network run and improve its own weight change algorithm (without significant theoretical limits). The author derives an initial gradient-based supervised sequence learning algorithm for an 'introspective' recurrent network that can 'speak' about its own weight matrix in terms of activations. It uses special subsets of its input and output units for observing its own errors and for explicitly analyzing and manipulating all of its own weights, including the weights responsible for analyzing and manipulating weights. The result is the first 'self-referential' neural network with explicit potential control over all of the adaptive parameters governing its behavior.
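To make the architecture concrete, the following is a minimal sketch of the idea described in the abstract, not the paper's exact formulation: a fully recurrent net whose special input units receive the last error and the value of the currently addressed weight, and whose special output units emit a read address, a write address, and a weight increment. All sizes, names, and the rounding-based addressing scheme are illustrative assumptions (the paper derives a differentiable, gradient-based version).

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical reconstruction of the `introspective' net: every weight in
    # W is addressable by the net itself via dedicated output units.
    N_IN, N_OUT, N_HID = 2, 1, 8       # conventional I/O sizes (assumed)
    N_SPECIAL_IN  = 2                  # [last observed error, last read weight]
    N_SPECIAL_OUT = 3                  # [read address, write address, delta]
    N = N_IN + N_SPECIAL_IN + N_HID + N_OUT + N_SPECIAL_OUT

    W = rng.normal(scale=0.1, size=(N, N))   # the weight matrix the net inspects
    h = np.zeros(N)                          # activations of all units

    def step(x, last_err, last_read_val):
        """One recurrent update; the net reads and edits its own W."""
        global h
        inp = np.zeros(N)
        inp[:N_IN] = x
        inp[N_IN:N_IN + N_SPECIAL_IN] = [last_err, last_read_val]
        h = np.tanh(W @ h + inp)             # fully recurrent dynamics
        y = h[-(N_OUT + N_SPECIAL_OUT):-N_SPECIAL_OUT]   # conventional output
        read_u, write_u, delta_u = h[-N_SPECIAL_OUT:]    # introspective output
        # Map the two address units onto indices into the flattened W
        # (plain rounding here; the paper needs differentiable addressing).
        read_addr  = int((read_u  + 1) / 2 * (W.size - 1))
        write_addr = int((write_u + 1) / 2 * (W.size - 1))
        read_val = W.flat[read_addr]         # the net `speaks' about its W
        W.flat[write_addr] += 0.01 * delta_u # the net modifies its own W,
                                             # including the addressing weights
        return y, read_val

    # Demo: the net observes its own error on a toy target and keeps running.
    err, read_val = 0.0, 0.0
    for t in range(5):
        x = rng.normal(size=N_IN)
        y, read_val = step(x, err, read_val)
        err = float(np.sum((y - 0.5) ** 2))  # toy constant target (assumed)
        print(f"t={t} err={err:.4f}")

Note the self-reference the abstract emphasizes: the write address and increment are themselves computed through W, so the weights that manipulate weights are among the weights being manipulated.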
Keywords :
learning (artificial intelligence); recurrent neural nets; adaptive parameters; initial gradient-based supervised sequence learning; neural networks; recurrent network; self-referential nets; weight change algorithm;
Conference_Titel :
Third International Conference on Artificial Neural Networks, 1993
Conference_Location :
Brighton, UK
Print_ISBN :
0-85296-573-7