• DocumentCode
    1909367
  • Title
    Backpropagation for linearly-separable patterns: A detailed analysis
  • Author
    Frasconi, Paolo ; Gori, Marco ; Tesi, Alberto
  • Author_Institution
    Dipartimento di Sistemi e Inf., Florence Univ., Italy
  • fYear
    1993
  • fDate
    1993
  • Firstpage
    1818
  • Abstract
    A sufficient condition for learning without local minima in multilayered networks is proposed. A fundamental assumption on the network architecture is removed. It is proved that the conclusions drawn by M. Gori and A. Tesi (IEEE Trans. Pattern Anal. Mach. Intell., vol. 14, no. 1, pp. 76-86, 1992) also hold provided that the weight matrix associated with the hidden and output layers is pyramidal and has full rank. The analysis is carried out using least mean squares (LMS)-threshold cost functions, which allow the identification of spurious and structural local minima.
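    As an illustrative sketch only (the notation below is an assumption, not part of this record): an LMS-threshold cost of the kind mentioned in the abstract penalizes an output only when it falls on the wrong side of its target threshold, e.g.
    $E(w) = \tfrac{1}{2}\sum_{p}\sum_{k}\big[\max\big(0,\; s_{p,k}\,(t_{p,k} - y_{p,k}(w))\big)\big]^{2}$,
    with thresholded targets $t_{p,k}$, class signs $s_{p,k} \in \{+1,-1\}$, and network outputs $y_{p,k}(w)$; patterns whose outputs already satisfy their thresholds contribute zero error, which is the kind of property that helps distinguish spurious from structural local minima.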
  • Keywords
    backpropagation; feedforward neural nets; matrix algebra; learning; least mean squares; local minima; multilayered networks; neural nets; sufficient condition; threshold cost functions; weight matrix; Algorithm design and analysis; Backpropagation algorithms; Cost function; Electronic mail; Interpolation; Joining processes; Neurons; Pattern analysis; Shape; Sufficient conditions
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    IEEE International Conference on Neural Networks, 1993
  • Conference_Location
    San Francisco, CA
  • Print_ISBN
    0-7803-0999-5
  • Type
    conf
  • DOI
    10.1109/ICNN.1993.298833
  • Filename
    298833