Title :
Least MSE reconstruction for self-organization. II. Further theoretical and experimental studies on one-layer nets
Author_Institution :
Dept. of Math., Peking Univ., Beijing, China
Abstract :
For Pt. I, see ibid., pp. 2362-2367 (1991). The one-layer case of a new self-organizing network is studied. The net is based on the least mean square error reconstruction (LMSER) principle for guiding learning, which results in a local learning rule also denoted LMSER. It is proven that, for one layer of n1 linear units, the LMSER rule causes their weight vectors to converge to rotations of the first n1 principal components of the input data. These convergence points are stable and correspond to the global (flat) minimum of the landscape of the reconstruction MSE; the landscape has no other local minima but many saddle points. It is also proven that, in the average sense, the evolution direction of the subspace rule of E. Oja (1989) has a positive projection on the evolution direction of LMSER. This connection yields several new results about the Oja subspace rule. One previously undiscovered role of the sigmoid nonlinearity is revealed: owing to this nonlinearity, competition and cooperation emerge automatically during learning with LMSER. As a result, each unit becomes a selective unit with its weight vector along the direction of one of the first n1 principal components.
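The abstract relates the LMSER rule to Oja's (1989) subspace rule, whose averaged evolution direction is shown to have a positive projection on that of LMSER. The sketch below is only a minimal illustration of the standard Oja subspace rule for learning a principal subspace; the exact LMSER update, including the sigmoid nonlinearity, is defined in the paper itself, and all function and variable names here are illustrative assumptions, not the authors' code.

```python
import numpy as np

def oja_subspace_step(W, x, lr=0.01):
    """One stochastic step of Oja's (1989) subspace rule.

    W : (d, n1) weight matrix; columns are the units' weight vectors.
    x : (d,) input sample, assumed zero-mean.
    Update: W <- W + lr * (x - W y) y^T, with linear outputs y = W^T x.
    The columns of W converge to an orthonormal basis (a rotation)
    of the subspace spanned by the first n1 principal components.
    """
    y = W.T @ x                       # linear unit outputs
    W += lr * np.outer(x - W @ y, y)  # reconstruction-error-driven update
    return W

# Illustrative usage on synthetic zero-mean data with a dominant 3-D subspace.
rng = np.random.default_rng(0)
d, n1, n_samples = 10, 3, 20000
scales = np.array([5.0, 3.0, 2.0] + [0.1] * (d - 3))
X = rng.normal(size=(n_samples, d)) * scales   # anisotropic Gaussian data
W = rng.normal(size=(d, n1)) * 0.1
for x in X:
    W = oja_subspace_step(W, x, lr=0.001)
# The columns of W now approximately span the top-3 principal subspace.
```

Note that this linear rule recovers the principal subspace only up to a rotation; the paper's point is that adding the sigmoid nonlinearity in LMSER induces competition and cooperation among units, so that each weight vector aligns with an individual principal component rather than an arbitrary rotation of them.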
Keywords :
learning systems; neural nets; self-adjusting systems; competition; cooperation; evolution direction; least mean square error reconstruction; local learning rule; one-layer nets; self-organizing network; sigmoid nonlinearity; Computer science; Covariance matrix; Data compression; Feature extraction; Learning systems; Mathematics; Mean square error methods; Pattern clustering; Principal component analysis; Vector quantization;
Conference_Title :
1991 IEEE International Joint Conference on Neural Networks
Print_ISBN :
0-7803-0227-3
DOI :
10.1109/IJCNN.1991.170742