Title :
Hardware implementation of the backpropagation without multiplication
Author :
Cloutier, Jocelyn ; Simard, Patrice Y.
Author_Institution :
Dept. d'Inf. et de Recherche Oper., Montreal Univ., Que., Canada
Abstract :
The backpropagation algorithm has been modified to work without any multiplications and to tolerate low-resolution computations, which makes it more attractive for hardware implementation. Numbers are represented in a floating-point format with a 1-bit mantissa and a 2-bit exponent for the states, and a 1-bit mantissa and a 4-bit exponent for the gradients, while the weights are 16-bit fixed-point numbers. In this way, all computations can be executed with shift and add operations. Large networks with over 100,000 weights were trained and demonstrated the same performance as networks computed with full precision. An estimate of a circuit implementation shows that a large network can be placed on a single chip, reaching more than 1 billion weight updates per second. A speedup is also obtained on any machine where a multiplication is slower than a shift operation.
Keywords :
backpropagation; floating point arithmetic; neural chips; 16 bit fixed-point numbers; backpropagation algorithm; floating-point format; hardware implementation; Backpropagation algorithms; Circuits; Computer architecture; Computer networks; Degradation; Equations; Hardware; Robustness; Stochastic processes;
Conference_Title :
Proceedings of the Fourth International Conference on Microelectronics for Neural Networks and Fuzzy Systems, 1994
Conference_Location :
Turin, Italy
Print_ISBN :
0-8186-6710-9
DOI :
10.1109/ICMNN.1994.593174