DocumentCode :
3285110
Title :
Principal component analysis by gradient descent on a constrained linear Hebbian cell
Author :
Chauvin, Yves
Author_Institution :
Thomson-CSF Inc., Palo Alto, CA, USA
fYear :
1989
fDate :
0-0 1989
Firstpage :
373
Abstract :
The behavior of a linear computing unit is analyzed during learning by gradient descent of a cost function equal to the sum of a variance-maximization term and a weight-normalization term. The landscape of this cost function is shown to be composed of one local maximum, a set of saddle points, and one global minimum aligned with the principal components of the input patterns. The cost landscape can be described in terms of the hyperspheres, hypercrests, and hypervalleys associated with each of these principal components. Using this description, it is shown that the learning trajectory converges to the global minimum of the landscape under certain conditions on the starting weights and the learning rate of the descent procedure. Furthermore, a precise description of the learning trajectory in this cost landscape is provided. Extensions and implications of the algorithm are discussed using networks of such cells.
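The abstract specifies only the structure of the cost (a variance-maximization term plus a weight-normalization term), not its exact form. The sketch below is a minimal illustration, assuming a cost of the form J(w) = -w'Cw + lam*(||w||^2 - 1)^2 with C the input covariance; the symbols lam, lr, and n_steps are illustrative assumptions, not values from the paper. Under this assumed cost, plain gradient descent on a single linear cell drives the weight vector into alignment with the first principal component.

import numpy as np

# Minimal sketch, not the paper's exact formulation: gradient descent on a
# single linear cell with the assumed cost
#   J(w) = -w' C w + lam * (||w||^2 - 1)^2,
# i.e. a variance-maximization term plus a weight-normalization penalty.
# The values of lam, lr, and n_steps are illustrative assumptions.

rng = np.random.default_rng(0)

# Synthetic zero-mean input patterns with one dominant principal direction.
n, d = 2000, 5
X = rng.standard_normal((n, d)) * np.array([3.0, 1.5, 1.0, 0.5, 0.2])
C = X.T @ X / n                          # input covariance matrix

lam, lr, n_steps = 1.0, 0.01, 2000
w = 0.1 * rng.standard_normal(d)         # small random starting weights

for _ in range(n_steps):
    # grad J = -2 C w (variance term) + 4 lam (||w||^2 - 1) w (penalty)
    grad = -2.0 * C @ w + 4.0 * lam * (w @ w - 1.0) * w
    w -= lr * grad

# At the global minimum the weights align with the first principal component.
eigvals, eigvecs = np.linalg.eigh(C)     # eigenvalues in ascending order
pc1 = eigvecs[:, -1]
print("cosine with first PC:", abs(w @ pc1) / np.linalg.norm(w))

Note that in this assumed form the normalization penalty keeps the weights bounded, so the variance term cannot drive them to infinity; this mirrors the constrained Hebbian behavior the abstract describes.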
Keywords :
learning systems; neural nets; optimisation; constrained linear Hebbian cell; cost landscape; global minimum; gradient descent; hypercrests; hyperspheres; hypervalleys; learning trajectory; linear computing unit; local maximum; principal component analysis; saddle points; variance maximization; Learning systems; Neural networks; Optimization methods;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1989
Conference_Location :
Washington, DC, USA
Type :
conf
DOI :
10.1109/IJCNN.1989.118611
Filename :
118611