Title :
Trial-and-error correlation learning
Author_Institution :
NTT LSI Lab., Kanagawa, Japan
Date :
1 July 1993
Abstract :
A new learning architecture is proposed for hardware implementation of neural networks. In this architecture, each synaptic weight is intentionally changed on each trial and then modified in proportion to the trial-and-error correlation between the change in the weight and the change in the total output error. If the weight changes are small, this learning performs almost as well as backpropagation (BP) learning, without requiring a complex backward network for error backpropagation. If the changes are large, the weights can move through the weight space without being trapped in a local minimum. Computer simulation shows that this learning surpasses BP learning in converging to the global minimum when the trial-and-error correlation is defined so as to emphasize the gain (i.e., the decrease in the total output error) rather than the loss.
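The abstract describes a weight-perturbation style rule: perturb every weight on each trial, then update each weight in proportion to the correlation between its perturbation and the resulting change in total output error. The following is a minimal sketch of that idea in Python/NumPy, not the paper's implementation; the network size, the hyperparameters (eta, sigma, gain_weight), and the asymmetric weighting used to "emphasize the gain" are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    # Tiny 2-2-1 network on XOR data; sizes are illustrative only.
    W1 = rng.normal(scale=0.5, size=(2, 2))
    W2 = rng.normal(scale=0.5, size=(1, 2))
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    def total_error(W1, W2):
        """Total squared output error over the training set."""
        H = np.tanh(X @ W1.T)
        Y = np.tanh(H @ W2.T)
        return np.sum((Y - T) ** 2)

    eta, sigma, gain_weight = 0.1, 0.05, 2.0  # assumed hyperparameters

    E = total_error(W1, W2)
    for trial in range(20000):
        # Intentionally perturb every weight on each trial.
        dW1 = sigma * rng.standard_normal(W1.shape)
        dW2 = sigma * rng.standard_normal(W2.shape)
        dE = total_error(W1 + dW1, W2 + dW2) - E
        # Asymmetric weighting: one plausible reading of "emphasize the
        # gain (decrease in error) rather than the loss".
        corr = gain_weight * dE if dE < 0 else dE
        # Update each weight in proportion to the correlation between
        # its perturbation and the change in total output error.
        W1 -= eta * corr * dW1 / sigma**2
        W2 -= eta * corr * dW2 / sigma**2
        E = total_error(W1, W2)

    print("final total error:", E)

With small sigma, corr * dW / sigma**2 is an unbiased stochastic estimate of the error gradient, which is why the abstract can claim near-BP performance without a backward network; larger perturbations trade gradient fidelity for the ability to jump out of local minima.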
Keywords :
correlation methods; learning (artificial intelligence); neural nets; global minimum; learning architecture; neural networks; synaptic weight; trial-and-error correlation; Amplitude modulation; Computer errors; Computer simulation; Network synthesis; Neural network hardware; Neural networks; Optimization methods; Signal synthesis; Spectroscopy; Stochastic processes;
Journal_Title :
IEEE Transactions on Neural Networks