DocumentCode :
1930474
Title :
Optimizing the learning of binary mappings
Author :
Bullinaria, John A.
Author_Institution :
Sch. of Comput. Sci., Univ. of Birmingham, UK
Volume :
4
fYear :
2003
fDate :
20-24 July 2003
Firstpage :
3207
Abstract :
When training simple sigmoidal feed-forward neural networks on binary mappings using gradient descent with a sum-squared-error cost function, the learning algorithm often gets stuck with some outputs totally wrong. This happens because the weight updates depend on the derivative of the output sigmoid, which goes to zero as the output approaches maximal error. Common remedies include offsetting the output targets, offsetting the sigmoid derivatives, and using a different cost function. Fair comparisons are difficult because each remedy has its own optimal parameter settings. In this paper, I use an evolutionary approach to optimize the parameters of each remedy and compare them on an equal footing.
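The flat-spot problem the abstract describes can be illustrated with a minimal sketch (the function names below are illustrative, not from the paper): with a sum-squared-error cost, the gradient at an output unit contains the sigmoid derivative y(1-y), which vanishes as the output saturates, even when that output is maximally wrong. The cross-entropy cost cancels this factor, and the derivative-offset fix (often attributed to Fahlman) adds a small constant to keep the gradient alive.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_deriv(y):
    # Derivative of the sigmoid expressed via its output y.
    return y * (1.0 - y)

def sse_delta(y, t):
    # Sum-squared-error gradient at the output unit:
    # (y - t) * y * (1 - y) -> vanishes as y -> 0 or 1,
    # even when the output is maximally wrong (y near 1, target 0).
    return (y - t) * sigmoid_deriv(y)

def ce_delta(y, t):
    # Cross-entropy cost cancels the sigmoid derivative,
    # leaving y - t, which stays large at maximal error.
    return y - t

def offset_delta(y, t, eps=0.1):
    # Derivative-offset fix: add a small constant eps so the
    # gradient never fully dies in the saturated regions.
    return (y - t) * (sigmoid_deriv(y) + eps)

# A saturated, maximally wrong output: y close to 1, target 0.
y = sigmoid(8.0)
print(sse_delta(y, 0.0))     # tiny gradient: learning is stuck
print(ce_delta(y, 0.0))      # near 1: learning proceeds
print(offset_delta(y, 0.0))  # offset keeps the update non-negligible
```

Offsetting the output targets (e.g. training toward 0.1 and 0.9 instead of 0 and 1) attacks the same problem from the other side, by keeping the outputs out of the saturated regions altogether.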
Keywords :
evolutionary computation; feedforward neural nets; learning (artificial intelligence); binary mappings; evolutionary approach; gradient descent algorithms; learning optimisation; simple sigmoidal feed-forward neural networks; sum-squared-error cost function; Computer hacking; Computer science; Cost function; Entropy; Equations; Error correction; Feedforward neural networks; Feedforward systems; Neural networks; Potential well;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 2003. Proceedings of the International Joint Conference on
ISSN :
1098-7576
Print_ISBN :
0-7803-7898-9
Type :
conf
DOI :
10.1109/IJCNN.2003.1224086
Filename :
1224086