DocumentCode
288656
Title
Neural networks as function approximators: teaching a neural network to multiply
Author
Vaccari, David A.; Wojciechowski, Edward
Author_Institution
Stevens Inst. of Technol., Hoboken, NJ, USA
Volume
4
fYear
1994
fDate
27 Jun-2 Jul 1994
Firstpage
2217
Abstract
Artificial neural networks (ANNs) were first proposed by Hecht-Nielsen (1987) as multivariate function approximators based on Kolmogorov's theorem. Since then, several researchers have proven that multilayer ANNs with an arbitrary squashing function in the hidden layer can approximate any multivariate function to any degree of accuracy. Based on these results, researchers have attempted to train backpropagation networks to realize arbitrary functions. Although their results are encouraging, this technique has many shortcomings and may lead to an inappropriate response by the network. In this paper, the authors present an alternative neural network architecture, based on cascaded univariate function approximators, which can be trained to multiply two real numbers and may be used to realize arbitrary multivariate function mappings.
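The abstract does not reproduce the authors' exact architecture, but a standard way to realize multiplication from univariate maps alone is the quarter-square identity x*y = ((x+y)^2 - (x-y)^2)/4, in which the only nonlinear step is the single univariate function s(u) = u^2. The sketch below assumes this identity; the names univariate_square and multiply, and the polynomial fit standing in for a trained univariate sub-network, are illustrative, not the paper's implementation.

import numpy as np

def univariate_square(u, coeffs):
    """Approximate s(u) = u**2 with a fitted polynomial, standing in
    for a trained univariate function approximator."""
    return np.polyval(coeffs, u)

# "Train" the univariate approximator on samples of u -> u**2.
u_train = np.linspace(-2.0, 2.0, 200)
coeffs = np.polyfit(u_train, u_train**2, deg=2)

def multiply(x, y):
    """Multiply two reals using only additions, subtractions, and two
    calls to the univariate approximator (quarter-square identity)."""
    return (univariate_square(x + y, coeffs)
            - univariate_square(x - y, coeffs)) / 4.0

print(multiply(0.7, -1.3))  # ~ -0.91

Because every nonlinearity is univariate, such multiplier blocks can be cascaded to build up arbitrary multivariate mappings, which is the kind of construction the abstract describes.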
Keywords
backpropagation; function approximation; neural net architecture; neural nets; Kolmogorov's theorem; backpropagation networks; cascaded univariate function approximators; hidden layer; multivariate function approximators; multivariate function mappings; neural networks; squashing function; Aerospace electronics; Artificial neural networks; Backpropagation; Education; Error correction; Function approximation; Multi-layer neural network; Network topology; Neural networks; Shape
fLanguage
English
Publisher
ieee
Conference_Title
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location
Orlando, FL
Print_ISBN
0-7803-1901-X
Type
conf
DOI
10.1109/ICNN.1994.374561
Filename
374561