Title :
Optimizing the learning of binary mappings
Author :
Bullinaria, John A.
Author_Institution :
Sch. of Comput. Sci., Univ. of Birmingham, UK
Abstract :
When training simple sigmoidal feed-forward neural networks on binary mappings using gradient descent algorithms with a sum-squared-error cost function, the learning algorithm often gets stuck with some outputs totally wrong. This is because the weight updates depend on the derivative of the output sigmoid, which goes to zero as the output approaches maximal error. Common solutions to this problem include offsetting the output targets, offsetting the sigmoid derivatives, and using a different cost function. Fair comparisons are difficult because each technique has its own optimal parameter settings. In this paper I use an evolutionary approach to optimize the parameters of each technique and thereby compare them on an equal footing.
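The flat-spot problem described in the abstract, and the sigmoid-derivative-offset fix it mentions, can be illustrated with a minimal sketch (this is illustrative code, not the paper's implementation; the offset value 0.1 is a commonly used choice, assumed here for demonstration):

```python
import math

def sigmoid(net):
    """Standard logistic sigmoid activation."""
    return 1.0 / (1.0 + math.exp(-net))

def output_delta(target, net, deriv_offset=0.0):
    """Output-unit error signal for a sum-squared-error cost.

    The update is (target - output) * sigmoid'(net), where
    sigmoid'(net) = output * (1 - output).  As the unit saturates,
    this derivative goes to zero even when the output is maximally
    wrong, so learning stalls.  Setting deriv_offset > 0 implements
    the "offset the sigmoid derivative" fix from the abstract.
    """
    out = sigmoid(net)
    deriv = out * (1.0 - out) + deriv_offset
    return (target - out) * deriv

# A unit whose target is 1 but whose net input is strongly negative:
net = -10.0                        # output ~ 4.5e-5, i.e. maximally wrong
print(output_delta(1.0, net))      # vanishingly small update: stuck
print(output_delta(1.0, net, 0.1)) # offset keeps the update alive
```

The other two remedies the abstract lists work analogously: offsetting the targets (e.g. 0.1/0.9 instead of 0/1) keeps units out of the saturated region, while a cross-entropy cost cancels the sigmoid derivative in the output delta altogether.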
Keywords :
evolutionary computation; feedforward neural nets; learning (artificial intelligence); binary mappings; evolutionary approach; gradient descent algorithms; learning optimisation; simple sigmoidal feed-forward neural networks; sum-squared-error cost function; Computer hacking; Computer science; Cost function; Entropy; Equations; Error correction; Feedforward neural networks; Feedforward systems; Neural networks; Potential well;
Conference_Title :
Proceedings of the International Joint Conference on Neural Networks, 2003
Print_ISBN :
0-7803-7898-9
DOI :
10.1109/IJCNN.2003.1224086