Title :
A sequential adder using recurrent networks
Author :
Tsung, Fu-Sheng ; Cottrell, Garrison W.
Author_Institution :
Dept. of Comput. Sci. & Eng., California Univ., San Diego, CA, USA
Abstract :
D.E. Rumelhart et al.'s proposal (1986) of how symbolic processing is achieved in PDP (parallel distributed processing) networks is tested by training two types of recurrent networks to learn to add two numbers of arbitrary length. A method of combining old and new training sets is developed that enables the networks to learn and generalize with very large training sets. Through this model of addition, the networks demonstrated the capability to perform simple conditional branching, while loops, and sequences, mechanisms essential for a universal computer. Differences between the two types of recurrent networks are discussed, as are implications for human learning.
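The digit-serial addition task described above can be sketched as follows. This is a hypothetical illustration of the target computation only, not the paper's network or input encoding: digit pairs are presented least-significant digit first, and the carry plays the role of the recurrent state the network must learn to maintain across time steps.

```python
def sequential_add(a: str, b: str) -> str:
    """Add two non-negative decimal numbers digit by digit.

    Illustrative sketch of the sequential addition task; the paper's
    networks learn this mapping rather than executing it symbolically.
    """
    da = [int(c) for c in reversed(a)]  # least-significant digit first
    db = [int(c) for c in reversed(b)]
    n = max(len(da), len(db))
    da += [0] * (n - len(da))  # pad the shorter operand with zeros
    db += [0] * (n - len(db))

    carry = 0  # the "recurrent state" carried from step to step
    out = []
    for x, y in zip(da, db):  # one time step per digit pair
        s = x + y + carry
        out.append(s % 10)    # emitted output digit
        carry = s // 10       # state passed to the next step
    if carry:
        out.append(carry)
    return "".join(str(d) for d in reversed(out))
```

Because the loop runs for as many steps as there are digit pairs, the same learned step generalizes to operands of arbitrary length, which is the sense in which the task exercises while-loop-like control.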
Keywords :
adders; learning systems; neural nets; parallel architectures; combined subset training; conditional branching; learning; neural networks; parallel distributed processing; recurrent networks; sequential adder; symbolic processing;
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1989
Conference_Location :
Washington, DC, USA
DOI :
10.1109/IJCNN.1989.118690