Title :
Backpropagation without multiplier for multilayer neural networks
Author :
Marchesi, M.L. ; Piazza, F. ; Uncini, A.
Author_Institution :
Dipartimento di Ingegneria Biofisica ed Elettronica, Genoa Univ., Italy
fDate :
8/1/1996
Abstract :
When multilayer neural networks are implemented in digital hardware, which allows full exploitation of well-developed digital VLSI technologies, the multiply operations in each neuron between the weights and the inputs can create a system bottleneck, because digital multipliers are very demanding in terms of time or chip area. For this reason, the use of weights constrained to powers of two has been proposed to reduce the computational requirements of the networks. In this case, because one of the two multiplier operands is a power of two, the multiply operation can be performed as a much simpler shift operation on the neuron input. While this approach greatly reduces the computational burden of the forward phase of the network, the learning phase, performed using the traditional backpropagation procedure, still requires many regular multiplications. In the paper, a new learning procedure based on the power-of-two approach is proposed that can be performed using only shift and add operations, so that both the forward and learning phases of the network can be easily implemented in digital hardware.
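The core idea in the abstract, that a multiplication by a power-of-two weight reduces to a bit shift, can be illustrated with a minimal sketch. This is not the authors' algorithm, only an assumed fixed-point setting: inputs are integers, and each weight is quantised to the nearest value of the form ±2^e.

```python
import math

def po2_quantize(w, e_min=-4, e_max=0):
    """Round |w| to the nearest power of two in [2^e_min, 2^e_max];
    return (sign, exponent). Exponent range is an illustrative assumption."""
    if w == 0:
        return 0, 0
    e = round(math.log2(abs(w)))
    e = max(e_min, min(e_max, e))
    return (1 if w > 0 else -1), e

def shift_mul(x, sign, e):
    """Compute x * (sign * 2**e) using only a shift on the fixed-point
    integer input x, as described for the forward phase."""
    if e >= 0:
        return sign * (x << e)
    return sign * (x >> -e)  # arithmetic right shift for negative exponents
```

For example, a weight of 0.3 quantises to +2^-2, so multiplying an input of 8 by it becomes `8 >> 2`, i.e. 2, with no multiplier hardware needed.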
Keywords :
VLSI; backpropagation; bottleneck; computational complexity; computational requirements; digital multipliers; learning algorithms; multilayer neural networks; multilayer perceptrons; multiply operations; neural net architecture; power-of-two representation; shift and add operations; systems engineering;
Journal_Title :
IEE Proceedings - Circuits, Devices and Systems
DOI :
10.1049/ip-cds:19960336