Title :
Learning by gradient descent in function space
Author_Institution :
Dept. of Comput. Sci., Univ. of Wisconsin-Madison, Madison, WI, USA
Abstract :
The use of connectionist networks in which each node may execute a different function is demonstrated as a means of achieving efficient supervised learning. A modified backpropagation algorithm for such networks, which performs gradient descent in function space, is presented and its advantages are discussed. The benefits of the proposed paradigm include faster learning and easier interpretation of the trained network. The potential for combining this approach with related techniques, including traditional backpropagation and reweighting, is also explored.
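Code_Sketch :
The abstract does not include the algorithm itself; the following is a minimal illustrative sketch in Python/NumPy, not the authors' implementation. It realizes gradient descent in function space via the common relaxation of letting a node compute a softmax-weighted mixture over a pool of candidate functions, so that ordinary gradient descent on the mixing coefficients moves the node through function space. All names here (CANDIDATES, alpha, the toy target) are assumptions for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pool of candidate node functions and their derivatives.
CANDIDATES = [
    (np.tanh,          lambda z: 1.0 - np.tanh(z) ** 2),
    (np.sin,           np.cos),
    (lambda z: z,      lambda z: np.ones_like(z)),
    (lambda z: z ** 2, lambda z: 2.0 * z),
]

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

# Toy regression task: fit y = sin(2x) with one node computing a
# softmax-weighted mixture of the candidates applied to w*x.
x = rng.uniform(-2.0, 2.0, size=200)
y = np.sin(2.0 * x)

w = 1.0                             # ordinary connection weight
alpha = np.zeros(len(CANDIDATES))   # coordinates in "function space"
lr = 0.05

for step in range(3000):
    p = softmax(alpha)
    z = w * x
    fs  = np.stack([f(z)  for f, _  in CANDIDATES])  # (K, N) candidate outputs
    dfs = np.stack([df(z) for _, df in CANDIDATES])  # (K, N) candidate slopes
    out = p @ fs
    err = out - y                   # dL/d(out) for L = 0.5 * mean(err^2)

    # Standard backprop step for the weight, through the mixture.
    grad_w = np.mean(err * (p @ dfs) * x)
    # Function-space step: d(out)/d(alpha_j) = p_j * (f_j(z) - out).
    grad_alpha = p * np.mean(err * (fs - out), axis=1)

    w -= lr * grad_w
    alpha -= lr * grad_alpha

print("mixture weights:", np.round(softmax(alpha), 3))
print("w =", round(w, 3), " final mse:", round(float(np.mean(err ** 2)), 4))

Softening the discrete choice of node function into a differentiable mixture is one standard way to make function selection amenable to gradient descent; a trained node can then be read off as approximately its dominant candidate function, which is consistent with the abstract's claim of easier interpretation.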
Keywords :
learning systems; neural nets; backpropagation; connectionist networks; function space; gradient descent; reweighting; supervised learning; Artificial neural networks; Casting; Circuit optimization; Computer networks; Integrated circuit interconnections; Intelligent networks; Neurons; Temperature; Tunable circuits and devices
Conference_Title :
1990 IEEE International Conference on Systems, Man and Cybernetics, Conference Proceedings
Conference_Location :
Los Angeles, CA
Print_ISBN :
0-87942-597-0
DOI :
10.1109/ICSMC.1990.142101