Title :
Distributed robust training of multilayer neural networks using normalized risk-averting error
Author :
Ninomiya, Hiroshi
Author_Institution :
Dept. of Inf. Sci., Shonan Inst. of Technol., Fujisawa, Japan
Abstract :
This paper describes a novel distributed quasi-Newton-based robust training method using the normalized risk-averting error (NRAE) with the gradual deconvexification (GDC) strategy. The training is carried out by optimizing the NRAE criterion in parallel across multiple computing units, which yields two major advantages: faster computation and global convergence. The key idea is to replace the log partition function of the NRAE with a parallelizable upper bound derived from the concavity of the logarithm. Experiments confirm that the method is robust and provides high-quality training solutions regardless of the initial values. Furthermore, the proposed distribution method drastically reduces CPU time without degrading the quality of the solutions.
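Note: the record itself does not reproduce the formulas, but the bounding step named in the abstract can be illustrated. The following is a minimal sketch, assuming the NRAE takes its usual form as a scaled logarithm of an average of exponentiated squared per-sample errors e_i(w); the exact normalization in the paper may differ. The tangent-line bound implied by the concavity of the logarithm turns the log-of-sum into a plain sum over samples, which is what makes the criterion parallelizable across data shards.

% Hedged sketch: NRAE as a scaled log-average of exponentiated
% squared per-sample errors e_i(w); the paper's exact normalization
% may differ.
C_\lambda(w) = \frac{1}{\lambda}\,
  \ln\!\Bigl(\frac{1}{m}\sum_{i=1}^{m} e^{\lambda\, e_i(w)^2}\Bigr),
\qquad
S(w) := \frac{1}{m}\sum_{i=1}^{m} e^{\lambda\, e_i(w)^2}.

% Concavity of ln gives a tangent-line upper bound at any anchor
% S_0 > 0 (e.g., S evaluated at the previous iterate):
\ln S(w) \;\le\; \ln S_0 + \frac{S(w) - S_0}{S_0},

% so the surrogate objective separates into per-sample terms; each
% computing unit can evaluate and differentiate its own data shard
% independently and the partial sums are combined afterwards:
C_\lambda(w) \;\le\; \frac{1}{\lambda}\bigl(\ln S_0 - 1\bigr)
  + \frac{1}{\lambda\, m\, S_0}\sum_{i=1}^{m} e^{\lambda\, e_i(w)^2}.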
Keywords :
learning (artificial intelligence); multilayer perceptrons; GDC strategy; NRAE criterion; distributed quasi-Newton-based robust training; distributed robust training; gradual deconvexification strategy; log partition function; log-function concavity; multilayer neural networks; normalized risk-averting error; parallelizable upper bound; Approximation algorithms; Function approximation; Neural networks; Nonhomogeneous media; Optimization; Robustness; Training; distributed quasi-Newton training; gradual deconvexification; highly-nonlinear function modeling; multilayer neural networks; normalized risk-averting error;
Conference_Title :
Computational Intelligence, Cognitive Algorithms, Mind, and Brain (CCMB), 2014 IEEE Symposium on
Conference_Location :
Orlando, FL
DOI :
10.1109/CCMB.2014.7020706