Title :
Fixed-point feedforward deep neural network design using weights +1, 0, and −1
Author :
Kyuyeon Hwang; Wonyong Sung
Author_Institution :
Dept. of Electr. Eng. & Comput. Sci., Seoul Nat. Univ., Seoul, South Korea
Abstract :
Feedforward deep neural networks that employ multiple hidden layers show high performance in many applications, but they demand complex hardware for implementation. The hardware complexity can be greatly lowered by minimizing the word-length of the weights and signals, but direct quantization for fixed-point network design does not yield good results. We optimize the fixed-point design by employing backpropagation-based retraining. The designed fixed-point networks with ternary weights (+1, 0, and -1) and 3-bit signals show only negligible performance loss compared to their floating-point counterparts. The backpropagation for retraining uses quantized weights and fixed-point signals to compute the output, but utilizes high-precision values for adapting the networks. Character recognition and phoneme recognition examples are presented.
Keywords :
backpropagation; character recognition; feedforward neural networks; fixed-point arithmetic; signal processing; 3-bit signals; backpropagation-based retraining; fixed-point feedforward deep neural network design; fixed-point signals; hardware complexity; phoneme recognition; quantized weights; ternary weights; error analysis; quantization (signal); training
Conference_Title :
2014 IEEE Workshop on Signal Processing Systems (SiPS)
Conference_Location :
Belfast
DOI :
10.1109/SiPS.2014.6986082