Title :
Neural Network Learning With Global Heuristic Search
Author :
Jordanov, I. ; Georgieva, Antoniya
Author_Institution :
Sch. of Comput., Univ. of Portsmouth
Date :
1 May 2007
Abstract :
A novel hybrid global optimization (GO) algorithm applied to supervised learning of feedforward neural networks (NNs) is investigated. The network weights are determined by minimizing the traditional mean square error function. The optimization technique, called LPτNM, combines a novel global heuristic search based on LPτ low-discrepancy sequences of points with a simplex local search. The proposed method is initially tested on multimodal mathematical functions and subsequently applied to training moderate-size NNs on popular benchmark problems. Finally, the results are analyzed, discussed, and compared with those obtained with backpropagation (BP) (Levenberg-Marquardt) and differential evolution methods.
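For illustration only (not the authors' code): a minimal Python sketch of the two-phase scheme the abstract describes, in which candidate weight vectors are drawn from a low-discrepancy point set spread over a hypercube and the most promising ones are refined by Nelder-Mead simplex search to minimize the network mean square error. SciPy's Sobol generator stands in for the LPτ sequence, and the 2-5-1 tanh network, the [-5, 5] weight bounds, and the sample counts are arbitrary assumptions made for the sketch.

```python
# Hedged sketch (assumptions, not the paper's implementation): global low-discrepancy
# sampling of NN weights followed by Nelder-Mead simplex refinement of the best points.
import numpy as np
from scipy.stats import qmc
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(100, 2))        # toy inputs (placeholder data)
y = np.sin(X[:, 0]) * np.cos(X[:, 1])            # toy targets

N_HIDDEN = 5
N_WEIGHTS = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1   # 2-5-1 net with biases = 21

def mse(w):
    """Mean square error of a 2-5-1 tanh network parameterized by weight vector w."""
    W1 = w[:10].reshape(2, 5)
    b1 = w[10:15]
    W2 = w[15:20]
    b2 = w[20]
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    return np.mean((out - y) ** 2)

# Global phase: evaluate the error on a low-discrepancy point set over a hypercube
# (Sobol used here in place of LPτ) and keep a few of the best points.
sampler = qmc.Sobol(d=N_WEIGHTS, seed=0)
candidates = qmc.scale(sampler.random_base2(m=10), -5.0, 5.0)   # 1024 points in [-5, 5]^21
scores = np.array([mse(w) for w in candidates])
best_starts = candidates[np.argsort(scores)[:5]]

# Local phase: refine each promising point with Nelder-Mead simplex search.
results = [minimize(mse, w0, method="Nelder-Mead",
                    options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-8})
           for w0 in best_starts]
best = min(results, key=lambda r: r.fun)
print(f"best MSE found: {best.fun:.6f}")
```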
Keywords :
feedforward neural nets; learning (artificial intelligence); mean square error methods; optimisation; search problems; LPτ low-discrepancy sequences; LPτNM optimization technique; backpropagation method; differential evolution method; feedforward neural network supervised learning; global heuristic search; hybrid global optimization algorithm; mean square error; multimodal mathematical functions; simplex local search; Backpropagation; Benchmark testing; Feedforward neural networks; Hypercubes; Machine learning; Mean square error methods; Medical diagnosis; Neural networks; Optimization methods; Supervised learning; Global optimization (GO); heuristic methods; low-discrepancy sequences; neural network (NN) learning; simplex search; Algorithms; Artificial Intelligence; Computer Simulation; Decision Support Techniques; Information Storage and Retrieval; Models, Statistical; Neural Networks (Computer); Pattern Recognition, Automated;
Journal_Title :
Neural Networks, IEEE Transactions on
DOI :
10.1109/TNN.2007.891633