Title :
Stochastic choice of basis functions in adaptive function approximation and the functional-link net
Author :
Igelnik, Boris ; Pao, Yoh-Han
Author_Institution :
Dept. of Electr. Eng. & Appl. Phys., Case Western Reserve Univ., Cleveland, OH, USA
fDate :
11/1/1995 12:00:00 AM
Abstract :
A theoretical justification for the random vector version of the functional-link (RVFL) net is presented in this paper, based on a general approach to adaptive function approximation. The approach consists of formulating a limit-integral representation of the function to be approximated and subsequently evaluating that integral with the Monte-Carlo method. Two main results are: (1) the RVFL is a universal approximator for continuous functions on bounded finite-dimensional sets, and (2) the RVFL is an efficient universal approximator, with the approximation error converging to zero at a rate of order O(C/√n), where n is the number of basis functions and C is independent of n. Similar results are also obtained for neural nets whose hidden nodes are implemented as products of univariate functions or as radial basis functions. Some possible ways of enhancing the accuracy of multivariate function approximation are discussed.
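For illustration, the following is a minimal sketch of an RVFL-style approximator in the spirit of the abstract: hidden-layer weights and biases are chosen at random and held fixed, and only the linear output weights are fitted by least squares. The tanh activation, the uniform sampling range, the node count, and the names rvfl_fit/rvfl_predict are illustrative assumptions, not the paper's prescribed construction.

```python
import numpy as np

def rvfl_fit(X, y, n_nodes=200, scale=1.0, seed=None):
    """Minimal RVFL-style fit: random, fixed hidden parameters;
    output weights solved by linear least squares. Illustrative only."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Hidden weights and biases are drawn at random and never trained,
    # reflecting the stochastic choice of basis functions.
    W = rng.uniform(-scale, scale, size=(d, n_nodes))
    b = rng.uniform(-scale, scale, size=n_nodes)
    H = np.tanh(X @ W + b)          # random nonlinear "enhancement" nodes
    G = np.hstack([X, H])           # functional-link: inputs also feed the output
    beta, *_ = np.linalg.lstsq(G, y, rcond=None)  # only these weights are learned
    return W, b, beta

def rvfl_predict(X, W, b, beta):
    G = np.hstack([X, np.tanh(X @ W + b)])
    return G @ beta

# Usage: approximate a continuous function on a bounded finite-dimensional set.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 2))
y = np.sin(np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1])
W, b, beta = rvfl_fit(X, y, n_nodes=200, seed=0)
mse = np.mean((rvfl_predict(X, W, b, beta) - y) ** 2)
```

In this sketch the expected error decreases as the number of random basis functions n grows, consistent with the O(C/√n) rate stated above, although the constant and the sampling distribution here are arbitrary choices.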
Keywords :
Monte Carlo methods; adaptive systems; feedforward neural nets; function approximation; Monte-Carlo method; adaptive function approximation; approximation error convergence rate; bounded finite dimensional sets; continuous function approximation; efficient universal approximator; functional-link net; limit-integral representation; multivariate function approximations; neural nets; radial basis functions; random vector version; stochastic choice; univariate function products; Approximation error; Convergence; Function approximation; Hypercubes; Integral equations; Neural networks; Random variables; Stochastic processes;
Journal_Title :
Neural Networks, IEEE Transactions on