Title :
Learning by parallel forward propagation
Abstract :
The back-propagation algorithm is widely used for learning the weights of multilayered neural networks. Its major drawbacks, however, are slow convergence and the lack of a proper way to set the number of hidden neurons. The author proposes a learning algorithm which solves these problems. The weights between two layers are successively calculated, with the other weights fixed, so that the error function, the sum of squared differences between the training data and the network outputs, is minimised. Since the calculation reduces to solving a set of linearized equations, redundancy of the hidden neurons is judged from the singularity of the corresponding coefficient matrix. For the exclusive-OR and parity check circuits, excellent convergence characteristics are obtained, and the redundancy of the hidden neurons is checked by the singularity of the matrix.
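The layer-wise step described in the abstract can be sketched as follows. This is a minimal illustration, not the author's implementation: it assumes a two-layer network on the XOR task, holds randomly initialised hidden weights fixed, and uses NumPy's least-squares solver in place of the paper's linearized-equation step, with the rank of the normal-equation coefficient matrix standing in for the singularity test. All names and the choice of a linear output layer are assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR training set: inputs with a bias column, and target outputs.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
t = np.array([0.0, 1.0, 1.0, 0.0])

rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(3, 3))  # fixed hidden-layer weights (3 hidden units)

# With the hidden weights fixed, the hidden activations are constants,
# so minimising the squared error over the output weights is a linear
# least-squares problem (the "linearized equations" of the abstract).
H = sigmoid(X @ W_hidden)

# Coefficient matrix of the normal equations; a (near-)singular matrix
# would signal redundant hidden neurons.
A = H.T @ H
redundant = np.linalg.matrix_rank(A) < A.shape[0]

# Solve for the output-layer weights with the other weights held fixed.
w_out, *_ = np.linalg.lstsq(H, t, rcond=None)
```

In the full algorithm each layer's weights would be recomputed in turn this way, alternating between layers until the error stops decreasing.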
Keywords :
learning systems; neural nets; parallel algorithms; parallel architectures; coefficient matrix; convergence characteristics; error function; exclusive-OR; hidden neurons; learning algorithm; linearized equations; multilayered neural networks; parallel forward propagation; parity check circuits; redundancy; singularity; training data; weights;
Conference_Titel :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
DOI :
10.1109/IJCNN.1990.137830