• DocumentCode
    1748800
  • Title
    Relation between weight initialization of neural networks and pruning algorithms: case study on Mackey-Glass time series

  • Author
    Wan, Weishui; Hirasawa, Kotaro; Hu, Jinglu; Murata, Junichi

  • Author_Institution
    Intelligent Control Lab., Kyushu Univ., Fukuoka, Japan
  • Volume
    3
  • fYear
    2001
  • fDate
    2001
  • Firstpage
    1750
  • Abstract
    Weight initialization is directly related to the convergence of learning algorithms. We carried out a case study on the Mackey-Glass time series problem to investigate the relation between the weight initialization of neural networks and pruning algorithms. The pruning algorithm used in the simulations is the Laplace regularizer method, i.e., the backpropagation algorithm with a Laplace regularizer added to the criterion function. Simulation results show that different kinds of initial weight matrices yield almost the same generalization ability when the pruning algorithm is used, at least for the Mackey-Glass time series.
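    The "Laplace regularizer method" named in the abstract is backpropagation with an L1 (Laplace) penalty on the weights added to the criterion function, so that small weights are driven toward zero and can be pruned. The sketch below is not the authors' code; the network size, the penalty strength lam, the learning rate, the pruning threshold, and the Gaussian initialization scale are illustrative assumptions showing how such a criterion and its gradient could look for a one-hidden-layer network:

    import numpy as np

    rng = np.random.default_rng(0)

    def init_weights(n_in, n_hidden, n_out, scale=0.1):
        """Gaussian weight initialization for both layers (one possible initialization scheme)."""
        W1 = rng.normal(0.0, scale, size=(n_in, n_hidden))
        W2 = rng.normal(0.0, scale, size=(n_hidden, n_out))
        return W1, W2

    def forward(x, W1, W2):
        h = np.tanh(x @ W1)          # hidden-layer activations
        return h, h @ W2             # network output

    def criterion(y_hat, y, W1, W2, lam=1e-3):
        """Squared error plus a Laplace (L1) regularizer on all weights."""
        return 0.5 * np.mean((y_hat - y) ** 2) + lam * (np.abs(W1).sum() + np.abs(W2).sum())

    def sgd_step(x, y, W1, W2, lam=1e-3, lr=0.01):
        """One backpropagation step; the L1 term adds lam * sign(W) to each weight gradient."""
        h, y_hat = forward(x, W1, W2)
        err = (y_hat - y) / len(x)
        gW2 = h.T @ err + lam * np.sign(W2)
        gW1 = x.T @ ((err @ W2.T) * (1.0 - h ** 2)) + lam * np.sign(W1)
        return W1 - lr * gW1, W2 - lr * gW2

    def prune(W, threshold=1e-2):
        """Zero out weights that the regularizer has pushed below a small threshold."""
        return np.where(np.abs(W) < threshold, 0.0, W)

    # Toy usage: fit a small regression target, then prune near-zero weights.
    x = rng.normal(size=(200, 4))
    y = np.sin(x[:, :1])
    W1, W2 = init_weights(4, 8, 1)
    for _ in range(2000):
        W1, W2 = sgd_step(x, y, W1, W2)
    W1, W2 = prune(W1), prune(W2)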
  • Keywords
    Gaussian distribution; backpropagation; convergence; neural nets; time series; Laplace regularizer method; Mackey-Glass time series; backpropagation algorithm; convergence; criterion function; generalization ability; initialization weight matrices; learning algorithms; neural networks; pruning algorithms; weight initialization; Backpropagation algorithms; Computer aided software engineering; Concrete; Convergence; Data mining; Glass; Information science; Intelligent control; Laboratories; Neural networks;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
  • Conference_Location
    Washington, DC
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7044-9
  • Type
    conf
  • DOI
    10.1109/IJCNN.2001.938426
  • Filename
    938426