Title :
An error perturbation for learning and detection of local minima in binary 3-layered neural networks
Author :
Yatsuzuka, Yohtaro
Author_Institution :
Res. & Dev. Lab., Kokusai Denshin Denwa Co. Ltd., Kamifukuoka, Japan
Abstract :
In binary multilayer neural networks trained with a backpropagation algorithm, achieving quick and stable convergence in binary space is a major issue for a wide range of applications. We propose a learning technique in which tenacious local minima can be evaded by perturbing the unit output errors in the output layer in both polarity and magnitude. Simulation results showed that a binary 3-layered neural network can converge very rapidly in binary space, with insensitivity to the set of initial weights, while providing high generalization ability. It is also pointed out that tenacious local minima can be detected by monitoring the minimum magnitude of the unit output errors for the erroneous binary outputs, and that overtraining with respect to generalization performance on test inputs can be roughly estimated by monitoring the minimum and maximum magnitudes of the unit output errors for the correct binary outputs.
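The abstract does not give the exact perturbation rule, so the following is only a minimal sketch of the general idea, under assumptions: a small 3-layered network trained on XOR by standard backpropagation, a stall heuristic standing in for the paper's local-minimum detection (the abstract's monitoring of minimum error magnitudes for erroneous binary outputs), and a random sign-and-scale perturbation of the output-layer errors when a stall is detected. All thresholds and constants here are illustrative, not the author's.

```python
import numpy as np

# Hypothetical sketch of error perturbation in a binary 3-layered network.
# The actual rule for perturbing polarity and magnitude is from the paper,
# not reproduced here; this only illustrates the mechanism on XOR (2-2-1 net).
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(size=(2, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
stall, prev_wrong, wrong = 0, 4, 4
for epoch in range(20000):
    H = sigmoid(X @ W1 + b1)          # hidden layer
    Y = sigmoid(H @ W2 + b2)          # output layer
    err = T - Y                       # unit output errors
    wrong = int(np.sum((Y > 0.5) != (T > 0.5)))  # erroneous binary outputs
    if wrong == 0:
        break                         # converged in binary space
    # Stall heuristic (assumption): binary errors persisting without
    # improvement suggests a tenacious local minimum.
    stall = stall + 1 if wrong >= prev_wrong else 0
    prev_wrong = wrong
    if stall > 500:
        # Perturb output errors in polarity and magnitude (illustrative rule).
        err = err * rng.uniform(0.5, 2.0, size=err.shape) \
                  * rng.choice([-1.0, 1.0], size=err.shape)
        stall = 0
    # Standard backpropagation updates with the (possibly perturbed) errors.
    dY = err * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 += lr * H.T @ dY; b2 += lr * dY.sum(axis=0)
    W1 += lr * X.T @ dH; b1 += lr * dH.sum(axis=0)

print(wrong)  # number of erroneous binary outputs after training
```

The perturbation is applied only when the stall counter trips, so ordinary gradient descent proceeds undisturbed whenever the binary error count is still improving.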
Keywords :
backpropagation; convergence; feedforward neural nets; generalisation (artificial intelligence); minimax techniques; perturbation techniques; binary multilayer neural networks; binary space; error perturbation; generalization; learning technique; local minima detection; output errors; overtraining; Artificial neural networks; Backpropagation algorithms; Convergence; Error correction; Fault diagnosis; Intelligent networks; Knowledge acquisition; Monitoring; Multi-layer neural network; Neural networks; Testing
Conference_Titel :
Proceedings of the 1995 IEEE International Conference on Neural Networks
Conference_Location :
Perth, WA
Print_ISBN :
0-7803-2768-3
DOI :
10.1109/ICNN.1995.487878