Title :
Method of digging tunnels horizontally into the error hypersurface to speed up training and to escape from local minima
Author :
Liang, Xun ; Xia, Shaowei ; Du, Jihong
Author_Institution :
Dept. of Autom., Tsinghua Univ., Beijing, China
Date :
27 Jun-2 Jul 1994
Abstract :
In this paper, a general compressing method is first reviewed systematically. The idea and steps of digging horizontally into the error hypersurface are then presented, together with an example. Because the error hypersurface is severely and complexly nonlinear, training by gradient descent is often very slow on plateaus and risks becoming trapped in local minima. Digging tunnels into the error hypersurface by means of a rotation transformation leads the training off the plateau, speeding up training, or lets it escape from local minima.
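The abstract names the technique only at a high level, so the following is a minimal Python sketch of the general idea rather than the authors' algorithm: plain gradient descent on a small multilayer perceptron, with a rotation of one hidden neuron's incoming weight vector (a Givens rotation, here) applied whenever the error stops decreasing, approximating a horizontal "tunnel" move along the error hypersurface to leave a plateau. The XOR data, plateau threshold, rotation angle, and network sizes are all illustrative assumptions.

```python
# Minimal sketch (not the paper's exact algorithm): gradient descent on a small
# one-hidden-layer network, with a rotation-based "tunnel" move when a plateau
# is detected.  Thresholds, angles, and the choice of rotated weights are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR, a classic small problem with plateaus in the error surface.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

n_hidden = 3
W1 = rng.normal(scale=0.5, size=(2, n_hidden))   # input -> hidden weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    return h, out

def loss(out):
    return 0.5 * np.mean((out - y) ** 2)

lr, plateau_tol, prev = 0.5, 1e-6, np.inf
for step in range(20000):
    h, out = forward(X)
    err = loss(out)

    # Plateau detection: a nearly flat error over one step triggers a tunnel move.
    if abs(prev - err) < plateau_tol:
        # Rotate the incoming weight vector of a random hidden neuron by a
        # random Givens rotation: the weights move without the step being a
        # descent direction, i.e. a roughly "horizontal" move in weight space.
        j = rng.integers(n_hidden)
        theta = rng.uniform(-np.pi / 4, np.pi / 4)
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])
        W1[:, j] = R @ W1[:, j]
    prev = err

    # Ordinary backpropagation for the squared-error loss.
    d_out = (out - y) * out * (1 - out) / len(X)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print("final error:", loss(forward(X)[1]))
```

The rotation preserves the norm of the perturbed weight vector, so the move changes the direction of the hidden neuron's weights rather than their magnitude; how the paper constructs its rotation transformation and decides when to apply it is specified in the full text, not in this sketch.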
Keywords :
data compression; learning (artificial intelligence); multilayer perceptrons; transforms; compressing method; error hypersurface; gradient descent method; hidden neuron; learning speed; local minima; neural nets; rotation transformation; tunnel digging; Neurons
Conference_Title :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
DOI :
10.1109/ICNN.1994.374159