Title :
Learned representation normalization: attention focusing with multiple input modules
Author :
Rossen, Michael L.
Author_Institution :
HNC Inc., San Diego, CA, USA
Date :
30 Sep-1 Oct 1991
Abstract :
A large, multi-modular neural network can be envisaged for use in a complex, multi-task application. The optimum data representation for each sub-task of such an application is often unknown and differs from the optimum representations for the other sub-tasks. A method is therefore needed that allows a network containing several alternative input representations to learn to focus its attention on the best representation(s) for each sub-task, without a priori information on the best representation/sub-task combinations. An adaptive attention-focusing method is introduced that addresses this issue. The method trains recurrent connections for each input module to selectively attenuate input to that module that causes training error in a final target module. The method is shown to have similarities with both gating networks and anti-Hebbian learning. A task scenario is proposed for which adaptive attention focusing provides superior classification performance relative to standard training methods.
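
The abstract does not give the update rule, so the following is only a minimal NumPy sketch of the idea it describes: per-module multiplicative gates, trained by gradient descent on the task loss, stand in for the paper's recurrent attenuating connections. All names (xa, xb, ga, gb, gate_decay) and the open-gate penalty are illustrative assumptions, not the authors' method. Module A carries the class signal; module B is a distractor whose gates should close.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data: module A carries the class label, module B is pure noise.
n, d = 500, 4
y = rng.integers(0, 2, size=n).astype(float)
xa = np.outer(2 * y - 1, np.ones(d)) + 0.3 * rng.standard_normal((n, d))
xb = rng.standard_normal((n, d))          # hypothetical distractor module

w = 0.01 * rng.standard_normal(2 * d)     # shared classifier weights
ga = np.zeros(d)                          # gate pre-activations, module A
gb = np.zeros(d)                          # gate pre-activations, module B
lr, gate_decay = 0.1, 0.01                # decay nudges unused gates shut

for step in range(500):
    sa, sb = sigmoid(ga), sigmoid(gb)     # gate values in (0, 1)
    x = np.concatenate([xa * sa, xb * sb], axis=1)
    p = sigmoid(x @ w)                    # predicted class probability
    err = p - y                           # dLoss/dlogit for cross-entropy

    grad_w = x.T @ err / n
    gx = err[:, None] * w[None, :]        # dLoss/dx, error credit per feature
    grad_ga = np.mean(gx[:, :d] * xa, axis=0) * sa * (1 - sa)
    grad_gb = np.mean(gx[:, d:] * xb, axis=0) * sb * (1 - sb)

    w -= lr * grad_w
    # Gates descend the task loss plus a small open-gate penalty, so
    # features that only contribute error are attenuated toward zero.
    ga -= lr * (grad_ga + gate_decay * sa * (1 - sa))
    gb -= lr * (grad_gb + gate_decay * sb * (1 - sb))

print("gates, informative module:", np.round(sigmoid(ga), 2))
print("gates, distractor module: ", np.round(sigmoid(gb), 2))

Run as-is, the informative module's gates open toward 1 while the distractor's close toward 0, which is the attention-focusing behavior the abstract attributes to the trained recurrent connections; the loss-driven attenuation of error-causing input is also where the stated resemblance to anti-Hebbian learning shows up in this reading.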
Keywords :
data structures; learning (artificial intelligence); recurrent neural nets; data representation; learned representation normalisation; multi-modular neural network; multi-task application; multiple input modules; recurrent connections; Algorithm design and analysis; Biological neural networks; Error analysis; Hebbian theory; Neural networks; Speech
Conference_Titel :
Neural Networks for Signal Processing: Proceedings of the 1991 IEEE Workshop
Conference_Location :
Princeton, NJ
Print_ISBN :
0-7803-0118-8
DOI :
10.1109/NNSP.1991.239530