• DocumentCode
    286750
  • Title
    Improved generalization and network pruning using adaptive Laplace regularization
  • Author
    Williams, P.M.
  • Author_Institution
    Sussex Univ., UK
  • fYear
    1993
  • fDate
    25-27 May 1993
  • Firstpage
    76
  • Lastpage
    80
  • Abstract
    Neural networks designed for regression or classification need to be trained using some form of stabilization or regularization if they are to generalize well beyond the original training set. This means finding a balance between the complexity of the network and the information content of the data. This paper examines a type of formal regularization in which the penalty term is proportional to the logarithm of the L_p norm of the weight vector, log (Σ_j |w_j|^p)^(1/p) (p ⩾ 1). The specific choice p=1 simultaneously provides both forms of stabilization, with radical pruning leading to greatly improved generalization.
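    The penalty term described in the abstract can be sketched in a few lines. This is a minimal illustration of the quantity log (Σ_j |w_j|^p)^(1/p); the function name and plain-Python style are assumptions, not the paper's implementation:

    ```python
    import math

    def lp_log_penalty(weights, p=1.0):
        """Log of the L_p norm of the weight vector:
        log((sum_j |w_j|^p)^(1/p)) = (1/p) * log(sum_j |w_j|^p), p >= 1."""
        if p < 1.0:
            raise ValueError("p must be >= 1")
        s = sum(abs(w) ** p for w in weights)
        return math.log(s) / p

    # p = 1 corresponds to the Laplace case the paper advocates
    penalty = lp_log_penalty([0.5, -1.2, 0.0, 2.0], p=1.0)
    ```

    In training, a term proportional to this penalty would be added to the data-misfit error; the logarithm makes the effective regularization strength adapt to the current scale of the weights.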
  • Keywords
    estimation theory; learning (artificial intelligence); neural nets; optimisation; Bayesian estimation; adaptive Laplace regularization; complexity; generalization; information content; network pruning; penalty term; weight vector;
  • fLanguage
    English
  • Publisher
    IET
  • Conference_Titel
    Artificial Neural Networks, 1993, Third International Conference on
  • Conference_Location
    Brighton
  • Print_ISBN
    0-85296-573-7
  • Type
    conf
  • Filename
    263253