DocumentCode :
816286
Title :
An Optimization Methodology for Neural Network Weights and Architectures
Author :
Ludermir, T.B. ; Yamazaki, A. ; Zanchettin, C.
Author_Institution :
Center of Informatics, Univ. Fed. de Pernambuco
Volume :
17
Issue :
6
fYear :
2006
Firstpage :
1452
Lastpage :
1459
Abstract :
This paper introduces a methodology for the global optimization of neural networks. The aim is the simultaneous optimization of multilayer perceptron (MLP) network weights and architectures, in order to generate topologies with few connections and high classification performance for any data set. The approach combines the advantages of simulated annealing, tabu search, and the backpropagation training algorithm to produce an automatic process for generating networks with high classification performance and low complexity. Experimental results on four classification problems and one prediction problem show that the methodology outperforms the most commonly used optimization techniques.
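A minimal sketch of the general idea described in the abstract: simulated annealing over MLP connection masks and weights, with a tabu list of recently visited topologies, followed by backpropagation fine-tuning of the best pruned network. The single-hidden-layer topology, the complexity weighting lam, the cooling schedule, and the toy XOR data are illustrative assumptions, not the authors' exact algorithm.

import numpy as np

rng = np.random.default_rng(0)

# Toy XOR data; a constant bias column is appended so the search can prune biases too.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
X = np.hstack([X, np.ones((4, 1))])
y = np.array([[0.], [1.], [1.], [0.]])
N_IN, N_HID, N_OUT = 3, 4, 1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(W1, W2, m1, m2):
    h = sigmoid(X @ (W1 * m1))        # hidden layer with pruned connections
    return sigmoid(h @ (W2 * m2)), h

def cost(W1, W2, m1, m2, lam=0.01):
    out, _ = forward(W1, W2, m1, m2)
    err = np.mean((out > 0.5) != (y > 0.5))        # classification error
    return err + lam * (m1.sum() + m2.sum()), err  # penalize connection count

def perturb(W1, W2, m1, m2):
    """Neighbour solution: jitter all weights, flip one random connection on/off."""
    W1n, W2n = W1 + rng.normal(0, 0.3, W1.shape), W2 + rng.normal(0, 0.3, W2.shape)
    m1n, m2n = m1.copy(), m2.copy()
    if rng.random() < 0.5:
        i, j = rng.integers(N_IN), rng.integers(N_HID)
        m1n[i, j] = 1 - m1n[i, j]
    else:
        i, j = rng.integers(N_HID), rng.integers(N_OUT)
        m2n[i, j] = 1 - m2n[i, j]
    return W1n, W2n, m1n, m2n

# Simulated annealing with a bounded tabu list of visited topologies.
cur = (rng.normal(0, 1, (N_IN, N_HID)), rng.normal(0, 1, (N_HID, N_OUT)),
       np.ones((N_IN, N_HID)), np.ones((N_HID, N_OUT)))
cur_c, _ = cost(*cur)
best, best_c = cur, cur_c
tabu, T = [], 1.0
for step in range(2000):
    cand = perturb(*cur)
    key = (cand[2].tobytes(), cand[3].tobytes())
    if key in tabu:                   # tabu search: skip recently seen topologies
        continue
    cand_c, _ = cost(*cand)
    if cand_c < cur_c or rng.random() < np.exp((cur_c - cand_c) / T):
        cur, cur_c = cand, cand_c     # Metropolis acceptance rule
        tabu = (tabu + [key])[-50:]   # bounded tabu memory
        if cur_c < best_c:
            best, best_c = cur, cur_c
    T *= 0.995                        # geometric cooling schedule

# Backpropagation fine-tuning of the best pruned topology (batch gradient descent).
W1, W2, m1, m2 = best
for epoch in range(5000):
    out, h = forward(W1, W2, m1, m2)
    d_out = (out - y) * out * (1 - out)            # output-layer error signal
    d_hid = (d_out @ (W2 * m2).T) * h * (1 - h)    # backpropagated hidden error
    W2 -= 0.5 * (h.T @ d_out) * m2                 # update only surviving weights
    W1 -= 0.5 * (X.T @ d_hid) * m1

_, err = cost(W1, W2, m1, m2)
print(f"connections kept: {int(m1.sum() + m2.sum())}, training error: {err:.2f}")

The cost function reflects the paper's dual objective: a solution is rewarded both for classifying well and for using few connections, so the annealing process naturally drifts toward small topologies that still perform.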
Keywords :
backpropagation; multilayer perceptrons; neural net architecture; search problems; simulated annealing; backpropagation training algorithm; global optimization; multilayer perceptron network; neural network architectures; neural network weights; simulated annealing; tabu search; Backpropagation algorithms; Cost function; Design optimization; Genetic algorithms; Multi-layer neural network; Multilayer perceptrons; Network topology; Neural networks; Optimization methods; Simulated annealing; Multilayer perceptron (MLP); optimization of weights and architectures; simulated annealing; tabu search; Algorithms; Information Storage and Retrieval; Neural Networks (Computer); Pattern Recognition, Automated; Signal Processing, Computer-Assisted;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2006.881047
Filename :
4012033