Title :
Gradient methods in unsupervised neural networks
Author :
Dajani, A.L. ; Kamel, Michel ; Elmasry, M.I.
Author_Institution :
Waterloo Univ., Ont., Canada
Abstract :
The authors discuss the application of gradient methods (steepest descent and conjugate gradients) to an unsupervised neural network that uses potential functions in the output layer, as previously reported by the authors (1990). For most unsupervised neural networks, the gradient step algorithm is the only applicable way to optimize the network's function; the network reported by the authors in 1990, however, admits other gradient methods. Experiments with the steepest descent/ascent and conjugate gradients algorithms show a potential decrease in the number of iterations. The contours of the network's function showed considerable variation, especially close to the local optima, which restricted the line search algorithms that could be used. Simulations confirm this by showing that a crude line search algorithm performs better than a semiexact line search algorithm.
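As a minimal sketch of the gradient methods named above: steepest descent paired with a crude (backtracking/Armijo) line search of the kind the abstract favors. The objective here is a generic quadratic stand-in, not the potential-function network objective from the paper, and all function names are illustrative.

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=500):
    """Steepest descent with a crude backtracking (Armijo) line search.

    Illustrative only: a stand-in objective, not the network function
    optimized in the paper.
    """
    x = np.asarray(x0, dtype=float)
    k = 0
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g                      # steepest-descent direction
        t = 1.0
        # Crude line search: halve the step until the Armijo
        # sufficient-decrease condition holds.
        while f(x + t * d) > f(x) - 1e-4 * t * (g @ g):
            t *= 0.5
        x = x + t * d
    return x, k

# Example: minimize the quadratic f(x) = x^T A x / 2 - b^T x,
# whose minimizer satisfies A x = b.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star, iters = steepest_descent(f, grad, np.zeros(2))
```

A halving backtracking rule like this is deliberately inexact; the abstract's finding is that such crude searches can outperform semiexact ones when the objective's contours vary sharply near local optima.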
Keywords :
iterative methods; neural nets; conjugate gradients; crude line search algorithm; gradient methods; gradient step algorithm; potential functions; semiexact line search algorithm; steepest descent; unsupervised neural networks; Application software; Clustering algorithms; Design engineering; Euclidean distance; Gradient methods; Intelligent networks; Neural networks; Neurons; Systems engineering and theory; Testing;
Conference_Titel :
1991 IEEE International Joint Conference on Neural Networks (IJCNN 1991)
Print_ISBN :
0-7803-0227-3
DOI :
10.1109/IJCNN.1991.170685