Title :
Acceleration of back-propagation for learning the forward and inverse kinematic equations
Author :
Ramdane-Cherif, Amar ; Perdereau, Véronique ; Drouin, Michel
Author_Institution :
Paris VI Univ., France
Abstract :
Learning, in the context of neural networks, means finding a set of synaptic weights that makes the network perform the desired function. Backpropagation has two major drawbacks in its learning efficiency: slow learning speed and convergence to local minima. In this paper, a 1D minimization with respect to the learning rate is incorporated into the backpropagation algorithm. Several 1D optimization techniques have been developed to adjust the learning rate during training, namely the Goldstein method, the Wolfe-Powell method, and the dichotomy method. These methods, embedded in the backpropagation algorithm, are used to learn the forward and inverse kinematic equations of a two-degree-of-freedom robot arm manipulator. The comparative study presented in this paper evaluates these methods by simulation against the standard backpropagation algorithm and the optimal gradient method. The simulation results show that the gradient method combined with the Goldstein or Wolfe-Powell method gives the best performance and the fastest minimization of the criterion.
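The core idea of the abstract, replacing a fixed learning rate with a 1D line search along the gradient direction at each step, can be sketched as follows. This is a minimal illustration, not the authors' implementation: a toy least-squares criterion stands in for the network's cost, and a simple Armijo backtracking rule (the sufficient-decrease half of the Goldstein/Wolfe-Powell tests) stands in for the paper's exact conditions.

```python
import numpy as np

def armijo_step(f, grad, w, d, alpha0=1.0, c=1e-4, tau=0.5):
    """Shrink alpha until f(w + alpha*d) satisfies a sufficient-decrease
    (Armijo) condition; Goldstein/Wolfe-Powell add further tests on the
    step size or slope, omitted here for brevity."""
    alpha, fw, slope = alpha0, f(w), grad(w) @ d
    while f(w + alpha * d) > fw + c * alpha * slope:
        alpha *= tau
    return alpha

# Toy quadratic criterion standing in for the network's error function.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
f = lambda w: 0.5 * np.sum((A @ w - b) ** 2)
grad = lambda w: A.T @ (A @ w - b)

w = np.zeros(5)
for _ in range(200):
    d = -grad(w)                      # steepest-descent direction
    alpha = armijo_step(f, grad, w, d)
    w = w + alpha * d                 # learning rate adapted each step
```

In the paper's setting, `f` would be the network's training criterion and `grad` the gradient obtained by backpropagation; the line search replaces the hand-tuned constant learning rate of standard backpropagation.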
Keywords :
backpropagation; neural nets; optimisation; robot kinematics; 1D optimization; Goldstein method; Wolfe-Powell method; dichotomy method; forward kinematic equations; inverse kinematic equations; learning rate; optimal gradient method; robot manipulator; Acceleration; Backpropagation algorithms; Convergence; Equations; Gradient methods; Kinematics; Minimization methods; Neural networks; Optimization methods; Robots;
Conference_Titel :
Second International Symposium on Neuroinformatics and Neurocomputers, 1995
Conference_Location :
Rostov-on-Don
Print_ISBN :
0-7803-2512-5
DOI :
10.1109/ISNINC.1995.480866