Title :
A stochastic method for minimizing functions with many minima
Author :
Ye, Hong; Lin, Zhiping
Author_Institution :
Sch. of Electr. & Electron. Eng., Nanyang Technol. Univ., Singapore
Abstract :
An efficient stochastic method for continuous optimization problems is presented. By combining a novel global search with standard local optimization methods, the proposed method targets hard optimization problems such as minimizing multimodal or ill-conditioned unimodal objective functions. Extensive numerical studies show that, starting from a random initial point, the proposed method is always able to find the global optimal solution. Computational results in comparison with other global optimization algorithms clearly illustrate the efficiency and accuracy of the method. Since traditional supervised neural-network training can be formulated as a continuous optimization problem, the proposed method can also be applied to neural-network learning.
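Illustration :
The abstract describes a hybrid of stochastic global search and local optimization. The paper's specific algorithm is not detailed here, so the following Python sketch shows only the general class of such methods, a random-restart hybrid; the function name stochastic_minimize, the choice of BFGS as the local optimizer, the restart count, and the Rastrigin test function are all illustrative assumptions, not the authors' method.

    # Generic sketch of a stochastic global + local hybrid optimizer.
    # Illustrative only; this is NOT the algorithm proposed in the paper.
    import numpy as np
    from scipy.optimize import minimize

    def stochastic_minimize(f, bounds, n_restarts=50, seed=0):
        """Random-restart local search: sample random start points in the
        given box, refine each with a local optimizer, keep the best."""
        rng = np.random.default_rng(seed)
        lo, hi = np.array(bounds).T
        best_x, best_f = None, np.inf
        for _ in range(n_restarts):
            x0 = rng.uniform(lo, hi)              # stochastic global exploration
            res = minimize(f, x0, method="BFGS")  # typical local optimizer
            if res.fun < best_f:
                best_x, best_f = res.x, res.fun
        return best_x, best_f

    # Example: the 2-D Rastrigin function, a standard multimodal test problem.
    def rastrigin(x):
        return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

    x_star, f_star = stochastic_minimize(rastrigin, bounds=[(-5.12, 5.12)] * 2)
    print(x_star, f_star)

With enough restarts this simple scheme escapes the poor local minima that trap a single local search on multimodal objectives, which is the problem setting the paper addresses with a more efficient global search.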
Keywords :
convergence of numerical methods; learning (artificial intelligence); minimisation; neural nets; stochastic processes; continuous optimization; continuous optimization problem; convergence properties; efficient stochastic method; global optimal solution; global optimization algorithms; global search; hard optimization problems; ill-conditioned unimodal objective functions; local optimization methods; minima; minimization; multimodal objective functions; neural-network learning; stochastic method; supervised neural-network training; Algorithm design and analysis; Computational modeling; Genetics; Least squares methods; Minimization methods; Newton method; Optimization methods; Recursive estimation; Simulated annealing; Stochastic processes;
Conference_Title :
Proceedings of the 2002 12th IEEE Workshop on Neural Networks for Signal Processing
Print_ISBN :
0-7803-7616-1
DOI :
10.1109/NNSP.2002.1030040