DocumentCode :
3707658
Title :
SWAP-NODE: A regularization approach for deep convolutional neural networks
Author :
Takayoshi Yamashita;Masayuki Tanaka;Yuji Yamauchi;Hironobu Fujiyoshi
Author_Institution :
Chubu University, 1200 Matsumoto-cho, Kasugai, Aichi, Japan
fYear :
2015
Firstpage :
2475
Lastpage :
2479
Abstract :
Regularization is important for training deep networks. One breakthrough approach is dropout, which randomly deletes a certain number of activations in each layer during the feed-forward step of training. Dropout significantly reduces the effect of over-fitting and improves test performance. We introduce a new regularization approach for deep learning called the swap-node. The swap-node, applied to a fully connected layer, swaps the activation values of two randomly selected nodes with a certain probability. Empirical evaluation shows that a network using the swap-node performs best on MNIST, CIFAR-10, and SVHN. We also demonstrate the superior performance of a combination of the swap-node and dropout on these datasets.
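The abstract describes the core operation: with a certain probability, swap the activations of two randomly chosen nodes in a fully connected layer. A minimal sketch of that idea in NumPy follows; the `swap_prob` parameter, the single-pair swap, and the function name are assumptions for illustration, since the abstract does not specify how many pairs are swapped or how the probability is scheduled.

```python
import numpy as np

def swap_node(activations, swap_prob=0.1, rng=None):
    """Hedged sketch of swap-node regularization.

    With probability `swap_prob`, swap the activation values of two
    randomly selected nodes along the last axis. Details beyond the
    abstract (pair count, probability schedule) are assumptions.
    """
    rng = np.random.default_rng() if rng is None else rng
    out = activations.copy()
    if rng.random() < swap_prob:
        # Pick two distinct node indices and exchange their values.
        i, j = rng.choice(out.shape[-1], size=2, replace=False)
        out[..., [i, j]] = out[..., [j, i]]
    return out
```

Because the operation only permutes values, the set of activations (and hence quantities like their sum) is unchanged; only the assignment of values to nodes is perturbed, which is what distinguishes it from dropout's deletion of activations.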
Keywords :
"Training","Error analysis","Training data","Convolution","Neural networks","Machine learning","Computer vision"
Publisher :
ieee
Conference_Titel :
2015 IEEE International Conference on Image Processing (ICIP)
Type :
conf
DOI :
10.1109/ICIP.2015.7351247
Filename :
7351247