• DocumentCode
    671574
  • Title
    Partially affine invariant training using dense transform matrices
  • Author
    Robinson, Melvin D.; Manry, Michael T.

  • Author_Institution
    Dept. of Electr. Eng., Univ. of Texas at Arlington, Arlington, TX, USA
  • fYear
    2013
  • fDate
    4-9 Aug. 2013
  • Firstpage
    1
  • Lastpage
    6
  • Abstract
    The concept of equivalent networks is reviewed as a method for testing algorithms for affine invariance. Partial affine invariance is defined and introduced into first-order training through the development of linear transforms of the hidden layer's net function vector. The resulting two-step training algorithm has convergence properties comparable to Levenberg-Marquardt, but with fewer multiplies per iteration.
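    The core identity behind the abstract's "linear transforms of the hidden layer's net function vector" can be illustrated numerically: left-multiplying the input weight matrix by a dense transform matrix transforms the net function vector by that same matrix. The sketch below is a minimal NumPy illustration of this identity only, with arbitrary example dimensions; it is not the authors' training algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical sizes chosen for illustration (not from the paper).
    n_in, n_hid = 4, 3
    x = rng.standard_normal(n_in)            # one input pattern
    W = rng.standard_normal((n_hid, n_in))   # input-to-hidden weight matrix
    A = rng.standard_normal((n_hid, n_hid))  # dense transform matrix

    # Hidden layer's net function vector: n = W x
    n_vec = W @ x

    # Applying the dense transform to the weights, (A W) x, yields the
    # linearly transformed net function vector A (W x).
    n_transformed = (A @ W) @ x

    assert np.allclose(n_transformed, A @ n_vec)
    ```

    Because the transform acts before the hidden-layer nonlinearity, it can be folded into the weight update itself, which is what makes a two-step scheme of this kind cheap relative to second-order methods.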
  • Keywords
    affine transforms; convergence; iterative methods; multilayer perceptrons; Levenberg-Marquardt; MLP architecture; convergence properties; dense transform matrices; equivalent networks; first order training; hidden layer net function vector; iteration; linear transforms; multilayer perceptron; partial affine invariance; partially affine invariant training; two-step training algorithm; Convergence; Equations; Mathematical model; Signal processing algorithms; Training; Transforms; Vectors
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    The 2013 International Joint Conference on Neural Networks (IJCNN)
  • Conference_Location
    Dallas, TX
  • ISSN
    2161-4393
  • Print_ISBN
    978-1-4673-6128-6
  • Type
    conf
  • DOI
    10.1109/IJCNN.2013.6706914
  • Filename
    6706914