DocumentCode :
3737760
Title :
Hyperconic multilayer perceptron for function approximation
Author :
Juan Pablo Serrano-Rubio;Rafael Herrera-Guzmán;Arturo Hernández-Aguirre
Author_Institution :
Information Technologies Laboratory, Technological Institute of Irapuato (ITESI), 36821 Irapuato Guanajuato, Mexico
fYear :
2015
Firstpage :
4702
Lastpage :
4707
Abstract :
In this paper, three different multilayer perceptrons are used to solve function approximation problems. We compare the performance of the traditional Multilayer Perceptron, the Hypersphere Multilayer Perceptron, and our own Hyperconic Multilayer Perceptron in the approximation of six benchmark functions. The Hypersphere Multilayer Perceptron and the Hyperconic Multilayer Perceptron architectures include neurons whose activation transfer functions produce non-linear (non-piecewise-linear) decision boundaries. The goal of this paper is to evaluate the advantages of these neural network models, with particular interest in higher-order neurons, and whether they need a larger or smaller number of neurons in the hidden layer in order to approximate continuous functions. The training of the neural network models is performed with an evolutionary algorithm designed by the authors and based on spherical inversions. The non-linear nature of the reproduction operator of this evolutionary algorithm furnishes an enhanced search capability for reaching the global optimum during the training stage.
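The abstract names two geometric ideas without giving code: neurons whose transfer functions yield non-piecewise-linear (e.g. spherical) decision boundaries, and a reproduction operator built on spherical inversions. A minimal sketch of both, purely as an assumption about how such components could look (the function names, the tanh squashing, and the parameterization are illustrative choices, not the authors' actual formulation):

```python
import numpy as np

def hypersphere_neuron(x, center, radius):
    """Hypothetical hyperspherical neuron: responds to the signed
    difference between the squared radius and the squared distance
    from a learned center, so its decision boundary is a sphere
    rather than a hyperplane."""
    return np.tanh(radius**2 - np.sum((x - center)**2))

def spherical_inversion(x, center, radius):
    """Sketch of a spherical inversion, the non-linear map such a
    reproduction operator could be based on: x is reflected through
    the sphere so that ||x' - c|| * ||x - c|| = r**2."""
    d = x - center
    return center + (radius**2 / np.dot(d, d)) * d

# Points inside the sphere map toward +1, points outside toward -1.
inside = hypersphere_neuron(np.array([0.1, 0.1]), np.zeros(2), 1.0)
outside = hypersphere_neuron(np.array([2.0, 2.0]), np.zeros(2), 1.0)

# Inversion swaps inside and outside: a point at distance 2 from the
# center of a unit sphere is mapped to distance 0.5.
inverted = spherical_inversion(np.array([2.0, 0.0]), np.zeros(2), 1.0)
```

The inversion maps candidate solutions non-linearly across the sphere's surface, which is the kind of transformation the abstract credits with improving global search during training.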
Keywords :
"Neurons","Multilayer perceptrons","Biological neural networks","Topology","Evolutionary computation","Training","Network topology"
Publisher :
ieee
Conference_Titel :
Industrial Electronics Society, IECON 2015 - 41st Annual Conference of the IEEE
Type :
conf
DOI :
10.1109/IECON.2015.7392834
Filename :
7392834