Title :
On the geometry of feedforward neural network weight spaces
Author :
Chen, An Mei ; Hecht-Nielsen, Robert
Author_Institution :
Dept. of Electr. & Comput. Eng., California Univ., La Jolla, CA, USA
Abstract :
As is well known, many feedforward neural network architectures have the property that their overall input-output function is unchanged by certain weight permutations and sign flips. The existence of these symmetries implies that if a global minimum of the network performance surface exists at some finite weight vector, then many copies of that global minimum can be generated by geometric weight transformations that leave the input-output function of the network unchanged. The geometric structure of these equierror weight-space transformations is explored for the case of multilayer perceptron architectures with tanh squashing functions. It is shown that these transformations form a subgroup of a reflection group. The authors also show that there exists a cone in weight space which forms a minimal sufficient search set for learning, and they establish the size of this cone. Finally, they show that the average distance between global minimum copies on a finite sphere known to contain a global minimum goes to zero as the number of weights in the network increases without bound.
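The permutation and sign-flip symmetries described in the abstract can be illustrated concretely. The following is a minimal sketch (not from the paper itself) of a one-hidden-layer perceptron with tanh squashing, checking that reordering hidden units, or negating all weights into and out of a single hidden unit, leaves the input-output function unchanged; all variable names here are illustrative:

```python
import numpy as np

def mlp(x, W1, b1, W2, b2):
    # One-hidden-layer perceptron with tanh squashing functions
    return np.tanh(x @ W1 + b1) @ W2 + b2

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 5, 2
W1 = rng.standard_normal((n_in, n_hid))
b1 = rng.standard_normal(n_hid)
W2 = rng.standard_normal((n_hid, n_out))
b2 = rng.standard_normal(n_out)
x = rng.standard_normal((4, n_in))

y = mlp(x, W1, b1, W2, b2)

# Sign flip: negate every weight into and out of hidden unit 2.
# Since tanh(-z) = -tanh(z), the two negations cancel in the output.
s = np.ones(n_hid)
s[2] = -1.0
y_flip = mlp(x, W1 * s, b1 * s, s[:, None] * W2, b2)

# Permutation: reorder the hidden units consistently in both layers.
p = rng.permutation(n_hid)
y_perm = mlp(x, W1[:, p], b1[p], W2[p], b2)

assert np.allclose(y, y_flip) and np.allclose(y, y_perm)
```

Composing such sign flips and permutations over all hidden units generates the equierror transformation group whose structure the paper analyzes.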
Keywords :
artificial intelligence; learning systems; neural nets; feedforward neural network weight spaces; geometry; global minimum; input-output function; learning; multilayer perceptron architectures; network performance; sign flips; tanh squashing functions;
Conference_Title :
Second International Conference on Artificial Neural Networks, 1991
Conference_Location :
Bournemouth
Print_ISBN :
0-85296-531-1