Title :
Partially affine invariant training using dense transform matrices
Author :
Robinson, Melvin D. ; Manry, Michael T.
Author_Institution :
Dept. of Electr. Eng., Univ. of Texas at Arlington, Arlington, TX, USA
Abstract :
The concept of equivalent networks is reviewed as a method for testing algorithms for affine invariance. Partial affine invariance is defined and introduced to first order training through the development of linear transforms of the hidden layer's net function vector. The resulting two-step training algorithm has convergence properties comparable to Levenberg-Marquardt, but requires fewer multiplies per iteration.
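The equivalent-networks idea can be illustrated with a minimal sketch (variable names and the specific compensation formulas below are illustrative, not taken from the paper): if one network sees inputs x and another sees affinely transformed inputs x' = Ax + b, the second network's input weights and thresholds can be compensated so that both produce identical hidden-layer net function vectors.

```python
import numpy as np

# Illustrative sketch of "equivalent networks" under an affine input
# transform. All names and shapes here are assumptions for demonstration.
rng = np.random.default_rng(0)
N, Nh, Nv = 4, 3, 5                  # inputs, hidden units, patterns
x = rng.normal(size=(Nv, N))         # original input patterns

# Affine transform of the input data: x' = A x + b (applied row-wise)
A = rng.normal(size=(N, N))
b = rng.normal(size=N)
x_t = x @ A.T + b

# Hidden-layer input weights and thresholds of the original network
W = rng.normal(size=(Nh, N))
t = rng.normal(size=Nh)

# Compensated parameters for the transformed data:
#   W' = W A^{-1},  t' = t - W' b
W2 = W @ np.linalg.inv(A)
t2 = t - W2 @ b

# Both networks produce the same hidden-layer net function vector,
# so they are equivalent regardless of the activation applied afterward.
net1 = x @ W.T + t
net2 = x_t @ W2.T + t2
print(np.allclose(net1, net2))       # the two net vectors match
```

A training algorithm is affine invariant if, started from such an equivalent pair, it keeps the two networks equivalent at every iteration; the paper uses this property as a test for its partially affine invariant first-order algorithm.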
Keywords :
affine transforms; convergence; iterative methods; multilayer perceptrons; Levenberg-Marquardt; MLP architecture; convergence properties; dense transform matrices; equivalent networks; first order training; hidden layer net function vector; iteration; linear transforms; multilayer perceptron; partial affine invariance; partially affine invariant training; two-step training algorithm; Convergence; Equations; Mathematical model; Signal processing algorithms; Training; Transforms; Vectors;
Conference_Title :
The 2013 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Dallas, TX
Print_ISBN :
978-1-4673-6128-6
DOI :
10.1109/IJCNN.2013.6706914