DocumentCode
329063
Title
Generalization in a perceptron with a sigmoid transfer function
Author
Ha, Sanghun ; Kang, Kukjin ; Oh, JongHoon ; Kwon, Chulan ; Park, Youngah
Author_Institution
Dept. of Phys., Pohang Inst. of Sci. & Technol., South Korea
Volume
2
fYear
1993
fDate
25-29 Oct. 1993
Firstpage
1723
Abstract
Learning in layered neural networks is studied using the methods of statistical mechanics. Networks are trained from examples using the Gibbs algorithm. We focus on the generalization curve, i.e., the average generalization error as a function of the number of examples, for perceptron learning with a sigmoid transfer function. Ising perceptrons, whose weights are constrained to be discrete, exhibit sudden learning at low temperatures within the annealed approximation: there is a first-order transition from a state of poor generalization to a state of perfect generalization. When the transfer function is smooth, the first-order transition occurs only at low temperatures and becomes continuous at high temperatures. When the transfer function is steep, the first-order transition line extends to higher temperatures. The analytic results show good agreement with computer simulations.
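
As an illustrative aside (not part of the paper): below is a minimal Python sketch of the setting the abstract describes, under stated assumptions. A teacher Ising perceptron with a tanh (sigmoid) transfer function generates labels; a student with discrete weights is drawn from the Gibbs distribution over training error via Metropolis sampling; the generalization error is then estimated by Monte Carlo as the number of examples grows. All parameter values and names (N, GAIN, T, SWEEPS) are hypothetical choices for illustration, not values from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    N = 40        # input dimension (hypothetical choice)
    GAIN = 2.0    # steepness of the sigmoid transfer function (assumption)
    T = 0.5       # temperature of the Gibbs distribution (assumption)
    SWEEPS = 100  # Metropolis sweeps per training-set size (assumption)

    def g(field):
        """Sigmoid (tanh) transfer applied to the normalized local field."""
        return np.tanh(GAIN * field / np.sqrt(N))

    def train_error(w, X, y):
        """Quadratic training error of a student with weights w."""
        return np.sum((g(X @ w) - y) ** 2)

    def gibbs_student(X, y):
        """Sample a student from P(w) ~ exp(-E(w)/T) over Ising weights
        via single-weight Metropolis flips (full error recomputation,
        kept simple for clarity rather than speed)."""
        w = rng.choice([-1.0, 1.0], size=N)
        e = train_error(w, X, y)
        for _ in range(SWEEPS):
            for i in rng.permutation(N):
                w[i] = -w[i]                  # propose flipping one weight
                e_new = train_error(w, X, y)
                if e_new <= e or rng.random() < np.exp((e - e_new) / T):
                    e = e_new                 # accept the flip
                else:
                    w[i] = -w[i]              # reject: undo the flip
        return w

    teacher = rng.choice([-1.0, 1.0], size=N)  # Ising teacher weights
    X_test = rng.standard_normal((2000, N))    # inputs for estimating eps_g

    for p in (10, 40, 80, 160):                # number of training examples
        X = rng.standard_normal((p, N))
        y = g(X @ teacher)                     # teacher-generated outputs
        w = gibbs_student(X, y)
        eps_g = np.mean((g(X_test @ w) - g(X_test @ teacher)) ** 2)
        print(f"alpha = p/N = {p / N:4.1f}   eps_g = {eps_g:.4f}")

In this sketch the generalization curve is traced by eps_g versus alpha = p/N; the paper's analysis instead treats this average analytically within the annealed approximation, which the simulation only mimics.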
Keywords
approximation theory; function approximation; generalisation (artificial intelligence); learning by example; multilayer perceptrons; simulated annealing; statistical analysis; transfer functions; Gibbs algorithm; Ising perceptrons; annealed approximation; generalization; layered neural networks; learning from examples; perceptron; sigmoid transfer function; statistical mechanics; Annealing; Computer simulation; Feedforward neural networks; Intelligent networks; Linearity; Neural networks; Neurons; Physics; Temperature; Transfer functions;
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
Print_ISBN
0-7803-1421-2
Type
conf
DOI
10.1109/IJCNN.1993.716986
Filename
716986