DocumentCode :
3149605
Title :
Hybrid methods for numerical optimization problems
Author :
Tahk, Min-Jea ; Ohta, Yoshichika
Author_Institution :
Korea Adv. Inst. of Sci. & Technol., Daejeon
fYear :
2008
fDate :
20-22 Aug. 2008
Firstpage :
17
Lastpage :
18
Abstract :
This presentation introduces hybrid optimization algorithms, which combine evolutionary algorithms (EA) and gradient search techniques, for optimization with continuous parameters. Inheriting the advantages of the two approaches, the new methods are fast and capable of global search. The key feature of the proposed hybrid methods is that gradient search becomes effective only when the solution region has been found and local search with fast convergence is needed. Thus, the transition from global search to local search is accomplished implicitly and automatically. The main structure of the new method is similar to that of EA, but a special individual called the gradient individual is introduced, and the EA individuals are located symmetrically around it. The gradient individual is propagated through generations by means of a quasi-Newton method modified for hybrid optimization. Gradient information is calculated from the costs of the EA individuals produced by the evolution strategy (ES), so no extra computational burden is incurred. For estimation of the inverse Hessian matrix, two approaches have been studied. The first is based on classical quasi-Newton algorithms, among which the Symmetric Rank-1 (SR1) update shows better performance than BFGS and DFP. The second is to estimate the Hessian matrix directly from the fitness values of the evolutionary population, rather than using estimates of the gradient vector. Two new algorithms developed for the second approach exhibit more stable Hessian estimation than the quasi-Newton algorithms. Numerical tests on various benchmark problems demonstrate that the new hybrid algorithms give a faster convergence rate than EA, without sacrificing global search capability.
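Illustrative sketch (not the authors' implementation; all function and variable names are hypothetical): the snippet below shows, in Python, how a central-difference gradient estimate can be recovered from fitness values of symmetrically placed individuals at no extra evaluation cost, and how an SR1 inverse-Hessian update could then drive a quasi-Newton step for the gradient individual, as described in the abstract.

```python
# Sketch of the two ingredients named in the abstract: a gradient estimate
# from symmetric EA individuals and an SR1 inverse-Hessian update.
import numpy as np

def gradient_from_symmetric_pairs(offsets, f_plus, f_minus):
    """Estimate the gradient at x from pairs x + d_i and x - d_i.

    offsets : (m, n) array of perturbation vectors d_i (EA individuals are
              assumed to be placed symmetrically about the gradient individual).
    f_plus, f_minus : fitness values already computed by the EA for
              x + d_i and x - d_i, so no extra evaluations are needed.
    """
    D = np.asarray(offsets)
    rhs = 0.5 * (np.asarray(f_plus) - np.asarray(f_minus))  # D @ g ~= rhs
    g, *_ = np.linalg.lstsq(D, rhs, rcond=None)
    return g

def sr1_update(H, s, y, eps=1e-8):
    """Symmetric Rank-1 update of the inverse-Hessian approximation H.

    s = x_new - x_old, y = g_new - g_old.  The update is skipped when the
    denominator is too small (the usual SR1 safeguard).
    """
    v = s - H @ y
    denom = v @ y
    if abs(denom) > eps * np.linalg.norm(v) * np.linalg.norm(y):
        H = H + np.outer(v, v) / denom
    return H

# Minimal usage example on a quadratic test function.
if __name__ == "__main__":
    f = lambda x: x @ np.diag([1.0, 10.0]) @ x
    x = np.array([3.0, -2.0])
    H = np.eye(2)
    g_old = None
    rng = np.random.default_rng(0)
    for _ in range(50):
        D = 0.1 * rng.standard_normal((4, 2))          # symmetric offsets
        g = gradient_from_symmetric_pairs(
            D, [f(x + d) for d in D], [f(x - d) for d in D])
        if g_old is not None:
            H = sr1_update(H, x_step, g - g_old)
        x_step = -H @ g                                 # quasi-Newton step
        for _ in range(30):                             # crude backtracking safeguard
            if f(x + x_step) <= f(x):
                break
            x_step *= 0.5
        x, g_old = x + x_step, g
    print("approximate minimizer:", x)
```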
Keywords :
Hessian matrices; convergence of numerical methods; estimation theory; evolutionary computation; optimisation; search problems; Hessian estimation; evolutionary algorithms; evolutionary population; fast convergence; gradient search technique; hybrid optimization algorithms; inverse Hessian matrix; numerical optimization problems; quasi-Newton method; Aerospace engineering; Automotive engineering; Control systems; Evolutionary computation; Haptic interfaces; Optimization methods; Robotics and automation; Speech; Symmetric matrices; USA Councils;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
SICE Annual Conference, 2008
Conference_Location :
Tokyo
Print_ISBN :
978-4-907764-30-2
Electronic_ISBN :
978-4-907764-29-6
Type :
conf
DOI :
10.1109/SICE.2008.4654606
Filename :
4654606