  • DocumentCode
    1126565
  • Title
    Regularization parameter estimation for feedforward neural networks
  • Author
    Guo, Ping; Lyu, Michael R.; Chen, C. L. Philip
  • Author_Institution
    Dept. of Comput. Sci., Beijing Normal Univ., China
  • Volume
    33
  • Issue
    1
  • fYear
    2003
  • fDate
    2/1/2003
  • Firstpage
    35
  • Lastpage
    44
  • Abstract
    Under the framework of the Kullback-Leibler (KL) distance, we show that a particular case of the Gaussian probability function for feedforward neural networks (NNs) reduces to the first-order Tikhonov regularizer. The smoothing parameter in kernel density estimation plays the role of the regularization parameter. Under some approximations, a formula is derived for estimating regularization parameters from training data sets. The similarities and differences between the obtained results and related work are discussed. Experimental results show that the estimation formula works well in sparse and small training-sample cases.
  • Keywords
    feedforward neural nets; learning (artificial intelligence); parameter estimation; probability; Gaussian probability function; Kullback-Leibler distance; feedforward neural networks; first-order Tikhonov regularizer; kernel density estimation; regularization parameter estimation; smooth parameter; training data sets; Computer science; Councils; Density functional theory; Feedforward neural networks; Kernel; Neural networks; Neurons; Parameter estimation; Smoothing methods; Training data;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
  • Publisher
    IEEE
  • ISSN
    1083-4419
  • Type
    jour
  • DOI
    10.1109/TSMCB.2003.808176
  • Filename
    1167352
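
The abstract links first-order Tikhonov regularization to a data-driven choice of the regularization parameter in small-sample settings. The paper's KL-based estimation formula is not reproduced in this record, so the sketch below is only a generic stand-in: it selects a Tikhonov (ridge) parameter on a tiny synthetic training set via leave-one-out cross-validation. The data, the candidate grid, and all names are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny, sparse training set -- the regime the abstract targets (illustrative data).
X = rng.normal(size=(12, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.3 * rng.normal(size=12)

def loo_error(X, y, lam):
    """Leave-one-out squared error of ridge regression with Tikhonov parameter lam."""
    errs = []
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        Xt, yt = X[mask], y[mask]
        # Ridge solution on the held-in points: (X'X + lam*I) w = X'y
        w = np.linalg.solve(Xt.T @ Xt + lam * np.eye(X.shape[1]), Xt.T @ yt)
        errs.append((y[i] - X[i] @ w) ** 2)
    return float(np.mean(errs))

# Pick the candidate lambda with the smallest leave-one-out error.
lambdas = np.logspace(-4, 1, 20)
best_lam = min(lambdas, key=lambda lam: loo_error(X, y, lam))
print(best_lam)
```

This generic cross-validation selection plays the same role as the paper's closed-form estimate: both choose the regularization strength from the training data rather than by hand tuning.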