DocumentCode :
2435336
Title :
Extensions to projection pursuit learning networks with parametric smoothers
Author :
Lay, Shyh-Rong ; Hwang, Jenq-Neng ; You, Shih-Shien
Author_Institution :
Dept. of Electr. Eng., Washington Univ., Seattle, WA, USA
Volume :
3
fYear :
1994
fDate :
27 Jun-2 Jul 1994
Firstpage :
1325
Abstract :
Neural networks that can grow their own structure during training have attracted considerable research interest. Projection pursuit learning networks (PPLNs) and cascaded correlation learning networks (CCLNs) are two such networks. Unlike a CCLN, where cascaded connections from the existing hidden units to the new candidate hidden unit are required to establish high-order nonlinearity in approximating the residual error, a PPLN approximates the high-order nonlinearity by using trainable nonlinear unit activation functions (e.g., Hermite polynomials). To relax the need for predefined smoothness of the nonlinearity in a PPLN, we propose in this paper a new learning network, called a pooling projection pursuit network (PPPN), which alleviates the critical requirement of adequate order pre-selection without suffering the regression performance degradation often encountered in a previously proposed cascaded projection pursuit network.
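The abstract's key mechanism is that each PPLN hidden unit applies a trainable smoother (e.g., a Hermite polynomial expansion) to a one-dimensional projection of the input. As an illustrative sketch only (the function names, the choice of probabilists' Hermite polynomials, and the plain least-squares coefficient fit are assumptions, not details from the paper), one projection pursuit term can be fitted like this:

```python
import numpy as np

def hermite_basis(z, order):
    """Probabilists' Hermite polynomials He_0..He_order, built with the
    recurrence He_{n+1}(z) = z*He_n(z) - n*He_{n-1}(z)."""
    H = [np.ones_like(z), z]
    for n in range(1, order):
        H.append(z * H[n] - n * H[n - 1])
    return np.stack(H[: order + 1], axis=1)  # design matrix, shape (N, order+1)

def fit_pp_term(X, y, a, order):
    """Fit one projection pursuit term f(a^T x): project the inputs onto
    direction a, then fit Hermite-polynomial coefficients by least squares.
    Hypothetical helper for illustration; a real PPLN also optimizes a."""
    z = X @ a                        # 1-D projection of the inputs
    B = hermite_basis(z, order)      # polynomial smoother basis
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    return coef, B @ coef            # coefficients and fitted values
```

The `order` argument is exactly the quantity whose pre-selection the proposed PPPN is meant to make less critical: too low an order underfits the residual, too high an order degrades regression performance.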
Keywords :
approximation theory; learning (artificial intelligence); neural nets; polynomials; smoothing methods; Hermite polynomials; high-order nonlinearity approximation; neural network; pooling projection pursuit network; trainable nonlinear unit activation functions; Backpropagation; Data analysis; Degradation; Joining processes; Laboratories; Neural networks; Neurons; Polynomials; Statistical learning; Transportation;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 1994. IEEE World Congress on Computational Intelligence., 1994 IEEE International Conference on
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
Type :
conf
DOI :
10.1109/ICNN.1994.374476
Filename :
374476