Title :
Explicit solutions of the optimum weights of layered neural networks
Author_Institution :
Dept. of Radio Eng., Southeast Univ., Nanjing, China
Abstract :
It is shown that, if the hidden-layer units take a sinusoidal activation function, the optimum weights of a three-layer feedforward neural network can be solved explicitly by relating the network to a truncated Fourier series expansion. Based on this result, two approaches are presented. The first is suited to the case in which detailed statistical information is available or can be easily estimated. The second is data-adaptive: the weights are obtained as the solution of a standard least-squares problem, so it can be implemented directly with the conventional LMS or RLS adaptive algorithms, making it well suited to real-time processing and slowly time-varying applications. It is also shown that, for both approaches, the resulting networks are capable of forming arbitrary mappings. The present approaches thus avoid the conventional training procedure, which is usually very time-consuming.
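To illustrate the data-adaptive approach described above, here is a minimal NumPy sketch (not the authors' implementation): the input-to-hidden weights are fixed at integer Fourier frequencies, so the hidden layer spans a truncated Fourier basis and the hidden-to-output weights follow from an ordinary least-squares solve; an LMS loop then shows the on-line variant. The target function, number of harmonics K, input domain, and step size mu are illustrative assumptions not taken from the paper.

```python
import numpy as np

def fourier_hidden(x, K):
    """Hidden-layer outputs: sinusoidal units at frequencies 1..K plus a bias.

    Equivalent to a truncated Fourier basis; the cosine terms are realized
    as phase-shifted sines, cos(kx) = sin(kx + pi/2).
    """
    cols = [np.ones_like(x)]                       # constant (bias) unit
    for k in range(1, K + 1):
        cols.append(np.sin(k * x))                 # sin(kx) hidden unit
        cols.append(np.sin(k * x + np.pi / 2))     # cos(kx) as shifted sine
    return np.column_stack(cols)

# Data-adaptive approach: the output weights solve a standard
# least-squares problem, so no iterative training is required.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0 * np.pi, size=200)        # training inputs (assumed domain)
f = np.where(x < np.pi, 1.0, -1.0)                 # example target: a square wave
H = fourier_hidden(x, K=8)                         # hidden-layer outputs
w, *_ = np.linalg.lstsq(H, f, rcond=None)          # explicit optimum output weights

x_test = np.linspace(0.0, 2.0 * np.pi, 5)
print(fourier_hidden(x_test, 8) @ w)               # network output approximates f

# The same weights can instead be tracked on-line with the LMS rule,
# matching the real-time / slowly time-varying case in the abstract.
w_lms = np.zeros(H.shape[1])
mu = 0.01                                          # step size (assumed)
for h_n, f_n in zip(H, f):
    e = f_n - h_n @ w_lms                          # instantaneous error
    w_lms += mu * e * h_n                          # LMS weight update
```

Because the hidden-layer (frequency) weights are fixed, only the linear output layer is estimated, which is what makes the closed-form and LMS/RLS solutions possible.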
Keywords :
feedforward neural nets; learning (artificial intelligence); least squares approximations; series (mathematics); arbitrary mappings; data-adaptive type; explicit solutions; feedforward neural network; hidden layer units; layered neural networks; optimum weights; sinusoidal activation function; standard least-squares; statistical information; truncated Fourier series expansion; Feedforward neural networks; Fourier series; Neural networks
Conference_Title :
IJCNN International Joint Conference on Neural Networks, 1992
Conference_Location :
Baltimore, MD
Print_ISBN :
0-7803-0559-0
DOI :
10.1109/IJCNN.1992.287102