DocumentCode :
3032020
Title :
Learning by gradient descent in function space
Author :
Mani, Ganesh
Author_Institution :
Dept. of Comput. Sci., Wisconsin Univ., Madison, WI, USA
fYear :
1990
fDate :
4-7 Nov 1990
Firstpage :
242
Lastpage :
247
Abstract :
The use of connectionist networks in which each node executes a different function to achieve efficient supervised learning is demonstrated. A modified backpropagation algorithm for such networks, which performs gradient descent in function space, is presented, and its advantages are discussed. The benefits of the suggested paradigm include faster learning and ease of interpretation of the trained network. The potential for combining this approach with other related approaches, including traditional backpropagation and reweighting, is explored.
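The abstract's core idea, descent in function space rather than weight space, can be illustrated with a minimal sketch. The following is not the paper's algorithm; it assumes one node whose output is a softmax-weighted mixture over a fixed set of candidate functions, with gradient descent on the mixture logits. The basis set, learning rate, and target are illustrative choices.

```python
# Hypothetical sketch (not the paper's exact method): a single node whose
# activation is a softmax mixture of candidate functions. Gradient descent
# on the mixture logits moves the node through "function space".
import numpy as np

rng = np.random.default_rng(0)
basis = [np.tanh, lambda z: 1.0 / (1.0 + np.exp(-z)), lambda z: z]  # candidate node functions

x = rng.uniform(-2, 2, size=200)
y = np.tanh(x)                      # target: the node should "become" tanh

theta = np.zeros(len(basis))        # logits over candidate functions
lr = 0.5

for step in range(500):
    a = np.exp(theta - theta.max())
    a /= a.sum()                                 # softmax mixture weights
    phi = np.stack([f(x) for f in basis])        # (K, N) basis responses
    out = a @ phi                                # mixed node output
    err = out - y
    grad_a = 2.0 * (phi @ err) / x.size          # dL/da_k for squared loss
    # softmax Jacobian: da_i/dtheta_j = a_i * (delta_ij - a_j)
    grad_theta = a * (grad_a - a @ grad_a)
    theta -= lr * grad_theta

print("mixture weights:", np.round(a, 3))        # concentrates on tanh
```

Because the mixture is differentiable, this selection step composes with ordinary weight updates, which is one way the combination with traditional backpropagation mentioned in the abstract could be realized.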
Keywords :
learning systems; neural nets; backpropagation; connectionist networks; function space; gradient descent; reweighting; supervised learning; Artificial neural networks; Casting; Circuit optimization; Computer networks; Integrated circuit interconnections; Intelligent networks; Neurons; Supervised learning; Temperature; Tunable circuits and devices;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
IEEE International Conference on Systems, Man and Cybernetics, 1990. Conference Proceedings.
Conference_Location :
Los Angeles, CA
Print_ISBN :
0-87942-597-0
Type :
conf
DOI :
10.1109/ICSMC.1990.142101
Filename :
142101