Title :
Neural network training and stochastic global optimization
Author_Institution :
Dept. of Comput. & Inf. Sci., De Montfort Univ., Leicester, UK
Abstract :
This paper investigates the local minima problem in supervised backpropagation learning of neural networks (NNs). The proposed training algorithm uses a stochastic optimization technique based on so-called low-discrepancy sequences. Learning is treated as an unconstrained optimization problem: once the parameter space (defined by the NN weights) and the objective function are specified, the method searches for a global optimum. First, regions of attraction that are candidates for containing local minima are identified; second, each region is searched to locate its minimum, and a global minimum is then selected among them. The algorithm is initially tested on multimodal mathematical functions and then on common benchmark problems for NN training. Finally, the results are discussed and compared with those obtained by backpropagation and other methods.
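A minimal sketch of the two-phase search described in the abstract: a low-discrepancy (here, Halton) scan of the parameter space followed by local refinement of the best candidates. The Halton construction, the Nelder-Mead local search, the Rastrigin stand-in for the NN training error, and all parameter values are illustrative assumptions, not the authors' exact method.

import numpy as np
from scipy.optimize import minimize

def halton(n, dim):
    """First n points of a Halton low-discrepancy sequence in [0, 1]^dim."""
    primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
    def radical_inverse(i, base):
        f, r = 1.0, 0.0
        while i > 0:
            f /= base
            r += f * (i % base)
            i //= base
        return r
    return np.array([[radical_inverse(i, primes[d]) for d in range(dim)]
                     for i in range(1, n + 1)])

def rastrigin(x):
    # Common multimodal benchmark; stands in for the NN error surface.
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def two_phase_search(f, lo, hi, dim, n_samples=2000, n_regions=5):
    # Phase 1: scan the box [lo, hi]^dim with low-discrepancy points
    # and keep the best samples as candidate regions of attraction.
    pts = lo + (hi - lo) * halton(n_samples, dim)
    vals = np.array([f(p) for p in pts])
    candidates = pts[np.argsort(vals)[:n_regions]]
    # Phase 2: local search from each candidate; keep the best minimum.
    results = [minimize(f, c, method="Nelder-Mead") for c in candidates]
    return min(results, key=lambda r: r.fun)

best = two_phase_search(rastrigin, -5.12, 5.12, dim=4)
print("approximate global minimum:", best.fun, "at", best.x)

Note that simply keeping the n_regions best samples may place several candidates in the same basin; a clustering step that merges nearby samples before refinement would separate basins, which is closer in spirit to the region-of-attraction phase the abstract describes.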
Keywords :
backpropagation; neural nets; optimisation; pattern classification; local minima problem; low discrepancy sequences; multimodal mathematical functions; neural network; objective functions; stochastic global optimization; supervised learning; Backpropagation algorithms; Benchmark testing; Computer networks; Convergence; H infinity control; Hypercubes; Neural networks; Optimization methods; Stochastic processes; Supervised learning;
Conference_Title :
Proceedings of the 9th International Conference on Neural Information Processing (ICONIP '02), 2002
Print_ISBN :
981-04-7524-1
DOI :
10.1109/ICONIP.2002.1202218