Title :
Newton's method backpropagation for complex-valued holomorphic multilayer perceptrons
Author :
La Corte, Diana Thomson; Zou, Yi Ming
Author_Institution :
Dept. of Math. Sci., Univ. of Wisconsin-Milwaukee, Milwaukee, WI, USA
Abstract :
The study of Newton's method in complex-valued neural networks faces many difficulties. In this paper, we derive Newton's method backpropagation algorithms for complex-valued holomorphic multilayer perceptrons, and investigate the convergence of the one-step Newton steplength algorithm for the minimization of real-valued functions of complex variables via Newton's method. To provide experimental support for the use of holomorphic activation functions, we compare sigmoidal activation functions against their Taylor polynomial approximations, using both the algorithms developed in this paper and the known gradient descent backpropagation algorithm. Our experiments indicate that the Newton's method based algorithms, combined with polynomial activation functions, significantly reduce the number of training iterations required compared with the existing algorithms.
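Illustrative_Sketch :
The following is a minimal sketch, not the authors' algorithm, of the idea the abstract describes: Newton's method applied to the real-valued loss of a complex weight, with a holomorphic polynomial activation (here a degree-3 Taylor polynomial of tanh). For simplicity the complex weight is treated as two real parameters and the 2x2 Hessian is formed by finite differences, rather than by the paper's backpropagated Hessian, and a full steplength of 1 is used rather than the one-step Newton steplength rule. The training pair and all function names (phi, loss, grad_hess) are hypothetical.

# Minimal sketch (assumptions as stated above): Newton's method on the
# real-valued loss E(w) = 0.5*|phi(w*x) - t|^2 for one complex weight w.
import numpy as np

x, t = 0.7 + 0.2j, 0.5 - 0.1j   # hypothetical complex training pair

def phi(z):
    """Degree-3 Taylor polynomial of tanh about 0: tanh(z) ~ z - z**3/3.
    Unlike tanh itself, this polynomial is entire (holomorphic everywhere)."""
    return z - z**3 / 3.0

def loss(p):
    """Real-valued loss as a function of p = [Re w, Im w]."""
    w = p[0] + 1j * p[1]
    e = phi(w * x) - t
    return 0.5 * (e * np.conj(e)).real

def grad_hess(p, h=1e-5):
    """Gradient and Hessian of the real loss by central finite differences."""
    n = len(p)
    g = np.zeros(n)
    H = np.zeros((n, n))
    for i in range(n):
        ei = np.eye(n)[i] * h
        g[i] = (loss(p + ei) - loss(p - ei)) / (2 * h)
        for j in range(n):
            ej = np.eye(n)[j] * h
            H[i, j] = (loss(p + ei + ej) - loss(p + ei - ej)
                       - loss(p - ei + ej) + loss(p - ei - ej)) / (4 * h * h)
    return g, H

p = np.array([0.1, 0.1])           # arbitrary initial weight
for k in range(10):
    g, H = grad_hess(p)
    p = p - np.linalg.solve(H, g)  # full Newton step (steplength 1)
    print(f"iter {k}: loss = {loss(p):.3e}")

Because phi is holomorphic and the residual equation phi(w*x) = t has an exact solution, the loss decays rapidly under these updates; the paper's contribution is doing this layer by layer via backpropagation rather than by finite differences.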
Keywords :
Newton method; backpropagation; gradient methods; multilayer perceptrons; polynomial approximation; Newton's method backpropagation algorithms; Taylor polynomial approximations; complex-valued holomorphic multilayer perceptrons; complex-valued neural networks; gradient descent backpropagation algorithm; holomorphic activation functions; one-step Newton steplength algorithm; real-valued complex functions; sigmoidal functions; Backpropagation algorithms; Convergence; Neural networks; Polynomials; Training; Vectors;
Conference_Title :
2014 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Beijing
Print_ISBN :
978-1-4799-6627-1
DOI :
10.1109/IJCNN.2014.6889384