  • DocumentCode
    285228
  • Title
    Improved generalization using robust cost functions
  • Author
    Joines, Jeff A.; White, Mark W.
  • Author_Institution
    Dept. of Electr. & Comput. Eng., North Carolina State Univ., Raleigh, NC, USA
  • Volume
    3
  • fYear
    1992
  • fDate
    7-11 Jun 1992
  • Firstpage
    911
  • Abstract
    The authors present several strategies for improving the generalization obtained by the standard backpropagation algorithm when the training set contains errors such as noise and/or irrelevant inputs. When the training set is noisy and small, some inputs or patterns are simply wrong. Backpropagation with a least-squares cost can then produce poor fits, because it seeks curves that accommodate both the erroneous patterns and the correct ones; such curves are usually far from representative of the true population. The goal of generalization is to find the curve that best fits the true underlying population rather than the training set. The proposed strategies include several robust cost functions that eliminate the effect these errors have on the training process.
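    The abstract does not name the specific robust cost functions the authors use, but the general idea can be sketched with a common robust cost such as the Huber loss: unlike the least-squares cost, it grows only linearly for large residuals, so a single erroneous training pattern cannot dominate the fit. The Huber loss here is an illustrative assumption, not necessarily the authors' choice.

    ```python
    import numpy as np

    def squared_error(r):
        """Ordinary least-squares cost: grows quadratically with residual r."""
        return 0.5 * np.asarray(r, dtype=float) ** 2

    def huber(r, delta=1.0):
        """Huber cost (illustrative robust cost): quadratic near zero,
        linear for |r| > delta, so an outlier's contribution is bounded."""
        r = np.asarray(r, dtype=float)
        quad = 0.5 * r ** 2
        lin = delta * (np.abs(r) - 0.5 * delta)
        return np.where(np.abs(r) <= delta, quad, lin)

    # Residuals for four clean training patterns and one gross outlier.
    residuals = np.array([0.1, -0.2, 0.05, 0.15, 10.0])

    lsq = squared_error(residuals).sum()  # outlier alone contributes 50.0
    rob = huber(residuals).sum()          # outlier contributes only 9.5
    ```

    Under least squares the single bad pattern accounts for nearly the entire cost, pulling the fitted curve toward it; under the robust cost its influence (and gradient) is capped, which is the mechanism the abstract describes for suppressing the effect of errors on training.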
  • Keywords
    backpropagation; generalisation (artificial intelligence); neural nets; backpropagation algorithm; generalization; least squares; noise; robust cost functions; training set; Computer errors; Cost function; Curve fitting; Error correction; Least squares approximation; Least squares methods; Neural networks; Noise robustness; Regression analysis;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    International Joint Conference on Neural Networks (IJCNN), 1992
  • Conference_Location
    Baltimore, MD
  • Print_ISBN
    0-7803-0559-0
  • Type
    conf
  • DOI
    10.1109/IJCNN.1992.227083
  • Filename
    227083