Author Institution:
Department of Electrical Engineering-Systems, University of Southern California, Los Angeles, CA 90089-0781, U.S.A.
Abstract:
A special class of feedforward neural networks, referred to as structured networks, has recently been introduced as a method for solving matrix algebra problems in an inherently parallel formulation. In this paper we present a convergence analysis for the training of structured networks. Since the learning techniques used in structured networks are the same as those employed in training neural networks, the issue of convergence is discussed not only from a numerical perspective but also as a means of deriving insight into connectionist learning. In our analysis, we develop bounds on the learning rate under which we prove exponential convergence of the weights to their correct values for a class of matrix algebra problems that includes linear equation solving, matrix inversion, and Lyapunov equation solving. For a special class of problems, we introduce what we call the orthogonalized backpropagation algorithm, an optimal recursive update law for minimizing a least-squares cost functional that guarantees exact convergence in one epoch. Several learning issues, such as normalizing techniques, persistency of excitation, input scaling, and non-unique solution sets, are investigated.
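To make the setting concrete, the following is a minimal numerical sketch, not the authors' implementation: it treats solving A x = b as training a single linear layer by gradient descent on a least-squares cost, using the standard learning-rate bound eta < 2 / lambda_max(A^T A) under which such an iteration converges exponentially. The specific bounds derived in the paper may be tighter or stated differently, and the matrix A below is an arbitrary well-conditioned example chosen only for illustration.

```python
import numpy as np

# The "network" for solving A x = b is a single linear layer whose weight
# vector x is trained by gradient descent on J(x) = 0.5 * ||A x - b||^2.
# (Illustrative example, not the paper's structured-network code.)
A = np.array([[4.0, 1.0, 0.0, 0.0],
              [0.0, 3.0, 1.0, 0.0],
              [0.0, 0.0, 3.0, 1.0],
              [1.0, 0.0, 0.0, 4.0]])
b = np.array([1.0, 2.0, 3.0, 4.0])

x = np.zeros(4)                              # weights start at zero
lam_max = np.linalg.eigvalsh(A.T @ A).max()  # largest eigenvalue of A^T A
eta = 1.0 / lam_max                          # safely inside eta < 2 / lam_max

for _ in range(500):
    error = A @ x - b       # forward pass: network output minus target
    x -= eta * A.T @ error  # gradient step on J(x); contraction factor
                            # per step is at most 1 - lam_min/lam_max

print(np.allclose(x, np.linalg.solve(A, b)))  # True: weights -> A^{-1} b
```

The choice eta = 1 / lambda_max keeps the iteration well inside the stability bound; pushing eta toward 2 / lambda_max speeds the slowest mode but risks oscillation, which is the trade-off the learning-rate bounds in the paper formalize.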