Authors:
Xu, Shipu (Department of Software Engineering, Tongji University, Shanghai, China)
Li, Runlong (Department of Railway Transportation, Shanghai Institute of Technology, Shanghai, China)
Wang, Yunsheng (Shanghai Academy of Agricultural Sciences, Shanghai, China)
Liu, Yong (Shanghai Academy of Agricultural Sciences, Shanghai, China)
Hu, Wenwen (Shanghai Academy of Agricultural Sciences, Shanghai, China)
Wu, Yingjing (Shanghai Academy of Agricultural Sciences, Shanghai, China)
Zhang, Chenxi (Department of Software Engineering, Tongji University, Shanghai, China)
Liu, Chang (School of Information Engineering, Nanchang Hangkong University, Nanchang, Jiangxi, China)
Ma, Chao (Shanghai Academy of Agricultural Sciences, Shanghai, China)
Abstract:
With the increasing depth and complexity of convolutional neural networks, the number of parameters and the volume of computation have greatly restricted their applications. Building on the SqueezeNet network structure, this study introduces block convolution and uses channel shuffle between blocks to alleviate information blockage. The method aims to reduce the parameter dimensionality of the original network structure and to improve the efficiency of network operation. Verification on the ORL dataset shows that classification accuracy and convergence efficiency are not reduced, and are even slightly improved, when the network parameters are reduced, which supports the validity of block convolution for structural lightweighting. Moreover, on the classic CIFAR-10 dataset, the network decreases parameter dimensionality while accelerating computation, with excellent convergence stability and efficiency, at a cost of only a 1.3% reduction in network accuracy.
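The abstract's channel shuffle between convolution blocks can be understood as a fixed permutation of channel indices: channels are viewed as a (groups, channels_per_group) grid, transposed, and flattened, so that each block in the next layer receives channels from every block in the previous layer. The snippet below is a minimal illustrative sketch of that permutation (the function name `channel_shuffle` and the list-based representation are assumptions for illustration, not the paper's implementation):

```python
def channel_shuffle(channels, groups):
    """Permute a flat list of channels as in a channel shuffle.

    Conceptually: reshape to (groups, n), transpose to (n, groups),
    then flatten. Assumes len(channels) is divisible by groups.
    """
    n = len(channels) // groups  # channels per group
    # Output position j*groups + k takes input channel k*n + j,
    # interleaving channels from all groups.
    return [channels[k * n + j] for j in range(n) for k in range(groups)]


# Example: 6 channels in 2 groups [0,1,2 | 3,4,5]
# shuffle interleaves them -> [0, 3, 1, 4, 2, 5]
print(channel_shuffle(list(range(6)), 2))
```

After the shuffle, a grouped (block) convolution in the following layer sees a mix of channels from all preceding groups, which is what counteracts the information blockage that pure block convolution would otherwise cause.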