Title :
Fast neural networks without multipliers
Author :
Marchesi, Michele ; Orlandi, Gianni ; Piazza, Francesco ; Uncini, Aurelio
Author_Institution :
Dipartimento di Ingegneria Biofisica e Elettron., Genova Univ., Italy
Date :
1/1/1993
Abstract :
Multilayer perceptrons (MLPs) with weight values restricted to powers of two or sums of powers of two are introduced. In a digital implementation, these neural networks need no multipliers, only shift registers, when computing in forward mode, thus saving chip area and computation time. A learning procedure based on backpropagation is presented for such neural networks. This learning procedure requires full real arithmetic and therefore must be performed offline. Test cases concerning MLPs with hidden layers of different sizes are presented for pattern recognition problems. These tests demonstrate the validity and the generalization capability of the method and give some insight into the behavior of the learning algorithm.
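The multiplier-free forward computation described in the abstract can be sketched as follows. This is illustrative Python, not the authors' code: each real weight is rounded to a signed power of two, so a weight-input product on integer activations reduces to a bit shift. The exponent range `[-7, 0]` is an assumed hardware budget, not a value from the paper.

```python
import math

def quantize_pow2(w):
    """Round a real weight to the nearest signed power of two.
    Returns (sign, exponent); exponents are clamped to [-7, 0]
    (an assumed hardware range, not from the paper)."""
    if w == 0:
        return 0, 0
    sign = 1 if w > 0 else -1
    e = round(math.log2(abs(w)))
    e = max(-7, min(0, e))
    return sign, e

def shift_dot(x_ints, weights):
    """Dot product of integer inputs with power-of-two weights,
    using shifts instead of multiplies: x * 2^e is x << e for
    e >= 0, or x >> -e for e < 0 (truncating, as in hardware)."""
    acc = 0
    for x, w in zip(x_ints, weights):
        sign, e = quantize_pow2(w)
        term = (x << e) if e >= 0 else (x >> -e)
        acc += sign * term
    return acc
```

For example, `shift_dot([8, 4], [1.0, -0.5])` computes `8*1 - (4 >> 1) = 6` without a single multiplication; the paper's offline backpropagation procedure would train in full real arithmetic and then constrain weights to such values.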
Keywords :
backpropagation; digital arithmetic; feedforward neural nets; fast neural nets; forward mode; hidden layers; learning procedure; multilayer perceptrons; pattern recognition; shift registers; Arithmetic; Computer networks; Concurrent computing; Hardware; Neural networks; Neurons; Testing; Ultra large scale integration; Very large scale integration;
Journal_Title :
IEEE Transactions on Neural Networks