Title :
Kolmogorov's spline network
Author :
Igelnik, Boris ; Parikh, Neel
Author_Institution :
Pegasus Technol. Inc., Mentor, OH, USA
Date :
7/1/2003 12:00:00 AM
Abstract :
This paper proposes and elucidates an innovative neural-network architecture. The architecture, based on Kolmogorov's superposition theorem (1957) and called the Kolmogorov's spline network (KSN), utilizes more degrees of adaptation to the data than currently used neural-network architectures (NNAs). By using a cubic spline technique of approximation for both the activation and the internal functions, a more efficient approximation of multivariate functions can be achieved. The bounds on the approximation error and on the number of adjustable parameters, derived in this paper, compare the KSN favorably with other one-hidden-layer feedforward NNAs. The training of the KSN, using the ensemble approach and the ensemble multinet, is described. A new explicit algorithm for constructing cubic splines is presented.
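For context, a minimal sketch of the classical spline construction the abstract builds on: the paper's own explicit algorithm is not reproduced here, so this is the standard natural cubic spline interpolant, obtained by solving a tridiagonal system for the second-derivative coefficients. All function names are illustrative, not taken from the paper.

```python
def natural_cubic_spline(x, y):
    """Compute natural cubic spline coefficients (a, b, c, d) so that on
    each interval [x[i], x[i+1]] the spline is
    S_i(t) = a[i] + b[i]*dt + c[i]*dt**2 + d[i]*dt**3, with dt = t - x[i].
    "Natural" means the second derivative vanishes at both endpoints."""
    n = len(x) - 1                      # number of intervals
    h = [x[i + 1] - x[i] for i in range(n)]
    # Right-hand side of the tridiagonal system for the c-coefficients.
    alpha = [0.0] * (n + 1)
    for i in range(1, n):
        alpha[i] = 3.0 * ((y[i + 1] - y[i]) / h[i]
                          - (y[i] - y[i - 1]) / h[i - 1])
    # Forward sweep of the Thomas algorithm.
    l = [1.0] * (n + 1)
    mu = [0.0] * (n + 1)
    z = [0.0] * (n + 1)
    for i in range(1, n):
        l[i] = 2.0 * (x[i + 1] - x[i - 1]) - h[i - 1] * mu[i - 1]
        mu[i] = h[i] / l[i]
        z[i] = (alpha[i] - h[i - 1] * z[i - 1]) / l[i]
    # Back-substitution yields c, then b and d follow directly.
    c = [0.0] * (n + 1)
    b = [0.0] * n
    d = [0.0] * n
    for i in range(n - 1, -1, -1):
        c[i] = z[i] - mu[i] * c[i + 1]
        b[i] = (y[i + 1] - y[i]) / h[i] - h[i] * (c[i + 1] + 2.0 * c[i]) / 3.0
        d[i] = (c[i + 1] - c[i]) / (3.0 * h[i])
    return y[:n], b, c[:n], d

def spline_eval(x, coeffs, t):
    """Evaluate the spline at t, assumed to lie in [x[0], x[-1]]."""
    a, b, c, d = coeffs
    i = len(a) - 1                      # locate the containing interval
    for j in range(len(a)):
        if t < x[j + 1]:
            i = j
            break
    dt = t - x[i]
    return a[i] + dt * (b[i] + dt * (c[i] + dt * d[i]))
```

A natural cubic spline reproduces any linear function exactly (all second-derivative coefficients vanish), which makes for a quick sanity check of the construction.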
Keywords :
neural net architecture; splines (mathematics); transfer functions; KSN; Kolmogorov spline network; adjustable parameter bound; approximation error bound; cubic spline technique; internal functions; multivariate function approximation; neural-network architecture; one-hidden layer feedforward NNA; superposition theorem; Approximation error; Combustion; Computer networks; Feedforward neural networks; Function approximation; Hypercubes; Neural networks; Power industry; Quantum computing; Spline;
Journal_Title :
IEEE Transactions on Neural Networks
DOI :
10.1109/TNN.2003.813830