Title :
Concaving Space Algorithm for Neural Network
Author :
Liu, Weiguo ; Peng, Qinke ; Huang, Yongxuan
Author_Institution :
Xi'an Jiaotong Univ., Xi'an
Abstract :
Reaching the global optimum and improving the convergence rate are central concerns in neural network training, and the efficient use of available information plays a noticeable role in the design of training methods. In this paper, not only gradient information but also the Hessian matrix is exploited to improve the learning efficiency and stability of a neural network. A necessary and sufficient condition for the Hessian matrix to be positive semi-definite is derived, so the connecting weight matrix W can be revised within concave domains. Hence, a stable distribution of weights can be reached, and the algorithm guides the training process towards the optimum. Compared with the standard gradient method, oscillating divergence is avoided and the convergence of the algorithm is accelerated.
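The abstract's central test, whether the Hessian of the training error is positive semi-definite so that weight updates stay within a concave domain, can be sketched as follows. This is a minimal illustration using an eigenvalue check, not the paper's own derivation; the function name and tolerance are assumptions introduced here:

```python
import numpy as np

def is_positive_semidefinite(H, tol=1e-10):
    """Illustrative check (not the paper's method): a symmetric Hessian H
    is positive semi-definite iff all its eigenvalues are non-negative,
    which marks the region where the error surface curves upward and a
    weight update W can safely be revised."""
    H = np.asarray(H, dtype=float)
    H = 0.5 * (H + H.T)  # symmetrize to guard against numerical asymmetry
    eigenvalues = np.linalg.eigvalsh(H)  # eigenvalues of a symmetric matrix
    return bool(np.all(eigenvalues >= -tol))

# A training step would apply the usual gradient update only when this
# condition holds, avoiding the oscillating divergence of plain gradient descent.
```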
Keywords :
Hessian matrices; learning (artificial intelligence); Hessian matrix resource; concaving space algorithm; connecting weight matrix; neural network training; Artificial neural networks; Convergence; Cybernetics; Feedforward neural networks; Feeds; Information resources; Joining processes; Neural networks; Stability; Systems engineering and theory;
Conference_Titel :
Systems, Man and Cybernetics, 2006. SMC '06. IEEE International Conference on
Conference_Location :
Taipei
Print_ISBN :
1-4244-0099-6
Electronic_ISBN :
1-4244-0100-3
DOI :
10.1109/ICSMC.2006.385100