Author_Institution :
Inst. of Math. Machines & Syst., Acad. of Sci., Kiev, Ukraine
Abstract :
Learning in neural networks is achieved by adjusting the interneural bond (synaptic) weights and, in some models, the neuron threshold levels. These adjustments are carried out either iteratively, by gradual tuning of the bond weights, or directly, by computing them from the stored data values. The latter approach, due to Hopfield (1982), is noniterative. Iterative methods require repeated presentation of all the stored information, which limits their practical use. On the other hand, despite its high storage speed, the Hopfield net has not gained wide popularity because of its small associative memory capacity, the impossibility of erasing out-of-date information, and its loss of stability when the memory is supersaturated. Personnaz et al. (1986) proposed an exact projection (pseudo-inverse) bond weight computation algorithm, which doubled the amount of storable data and resolved the problem of Hopfield network instability. Reznik and Gorodnichy (1996, 1997) studied the effect of synaptic matrix deformation on the behaviour of pseudo-inverse networks and proposed the memory desaturation technique, which doubled the capacity of the network once more. Later, this technique led us to the dynamic desaturation technique, which makes possible the gradual release of memory from obsolete information. In this work we present the results of our investigation into noniterative learning methods for neural networks.
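To make the projection (pseudo-inverse) rule mentioned above concrete, the following NumPy sketch is an illustrative reconstruction, not the authors' code: it computes the weight matrix W = X X^+ (the orthogonal projector onto the span of the stored patterns), so every stored pattern is an exact fixed point of the retrieval dynamics, in contrast with the classical outer-product Hopfield rule. The network size, pattern count, and corruption level are arbitrary choices for the demo.

```python
import numpy as np

def hopfield_weights(X):
    """Classical Hopfield (outer-product) rule: W = X X^T / N, zero diagonal."""
    N = X.shape[0]
    W = X @ X.T / N
    np.fill_diagonal(W, 0.0)
    return W

def pseudo_inverse_weights(X):
    """Projection rule of Personnaz et al. (1986): W = X X^+.
    W projects onto the column span of X, so each stored pattern
    (a column of X) satisfies W s = s exactly."""
    return X @ np.linalg.pinv(X)

def recall(W, s, max_iters=100):
    """Synchronous retrieval: iterate s <- sign(W s) until a fixed point."""
    for _ in range(max_iters):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1  # break ties consistently
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Toy usage: N = 64 neurons, M = 20 random bipolar patterns (columns of X).
rng = np.random.default_rng(0)
N, M = 64, 20
X = rng.choice([-1.0, 1.0], size=(N, M))
W = pseudo_inverse_weights(X)
probe = X[:, 0].copy()
probe[:6] *= -1  # corrupt 6 of the 64 bits
restored = recall(W, probe)
print(np.array_equal(restored, X[:, 0]))  # typically True, well past the ~0.14N Hopfield limit
```

Storing M = 20 patterns in a 64-neuron net (a load of roughly 0.31N) is well beyond the capacity at which the outer-product rule degrades, which illustrates the capacity advantage the abstract attributes to the pseudo-inverse approach; the desaturation techniques it then describes go further by modifying this synaptic matrix to free capacity occupied by obsolete patterns.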
Keywords :
Hopfield neural nets; content-addressable storage; inverse problems; learning (artificial intelligence); stability; Hopfield neural net; associative memory capacity; dynamic desaturation technique; exact projection bond weight computation algorithm; interneural bond weight adjustment; memory desaturation technique; memory supersaturation; neural networks; neuron threshold level adjustment; noniterative learning; pseudo-inverse bond weight computation algorithm; stability loss; storable data values; synaptic matrix deformation; Associative memory; Biological neural networks; Bonding; Convergence; Learning systems; Machine learning; Mathematical model; Neural networks; Neurons; Stability