• DocumentCode
    856570
  • Title
    Using random weights to train multilayer networks of hard-limiting units
  • Author
    Bartlett, P.L.; Downs, Tom
  • Author_Institution
    Dept. of Electr. Eng., Queensland Univ., St. Lucia, Qld., Australia
  • Volume
    3
  • Issue
    2
  • fYear
    1992
  • fDate
1 March 1992
  • Firstpage
    202
  • Lastpage
    210
  • Abstract
A gradient descent algorithm suitable for training multilayer feedforward networks of processing units with hard-limiting output functions is presented. The conventional backpropagation algorithm cannot be applied in this case because the required derivatives are not available. However, if the network weights are random variables with smooth distribution functions, the probability of a hard-limiting unit taking one of its two possible values is a continuously differentiable function. In the paper, this property is used to develop an algorithm similar to backpropagation, but for the hard-limiting case. It is shown that the computational framework of this algorithm is similar to that of standard backpropagation, but that there is an additional computational expense involved in the estimation of gradients. Upper bounds on this estimation penalty are given. Two examples are presented which indicate that, when the algorithm is used to train networks of hard-limiting units, its performance is similar to that of conventional backpropagation applied to networks of units with sigmoidal characteristics.
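  • Example_Sketch
    The abstract's key idea is that, although a hard-limiting unit has no usable derivative, the probability of the unit taking each of its two output values is a smooth function of the parameters of a random weight distribution, so gradient descent can be applied to those parameters. The following is a minimal, hypothetical Python sketch of that idea for a single threshold unit; the Gaussian weight distribution, squared-error loss, and closed-form gradient are illustrative assumptions, not the paper's multilayer algorithm.

    # Minimal sketch (illustrative assumptions, not the authors' published
    # algorithm): a single hard-limiting unit y = step(w.x) with random
    # weights w ~ N(mu, sigma^2 I). P(y = 1) is then a smooth function of mu,
    # so mu can be trained by ordinary gradient descent.
    import math
    import random

    def p_output_one(mu, sigma, x):
        """P(step(w.x) = 1) for w ~ N(mu, sigma^2 I); differentiable in mu."""
        norm_x = math.sqrt(sum(xi * xi for xi in x))
        z = sum(m * xi for m, xi in zip(mu, x)) / (sigma * norm_x)
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    def grad_p(mu, sigma, x):
        """Analytic gradient of P(output = 1) with respect to mu."""
        norm_x = math.sqrt(sum(xi * xi for xi in x))
        z = sum(m * xi for m, xi in zip(mu, x)) / (sigma * norm_x)
        pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal density at z
        return [pdf * xi / (sigma * norm_x) for xi in x]

    def train(data, dim, sigma=1.0, lr=0.5, epochs=200):
        """Gradient descent on (P(output=1) - target)^2 over the mean weights."""
        mu = [random.gauss(0.0, 0.1) for _ in range(dim)]
        for _ in range(epochs):
            for x, target in data:
                err = p_output_one(mu, sigma, x) - target  # factor of 2 folded into lr
                g = grad_p(mu, sigma, x)
                mu = [m - lr * err * gi for m, gi in zip(mu, g)]
        return mu

    # Toy usage: learn "output = first input" on a linearly separable set;
    # the last input component acts as a constant bias.
    data = [([0.0, 0.0, 1.0], 0), ([0.0, 1.0, 1.0], 0),
            ([1.0, 0.0, 1.0], 1), ([1.0, 1.0, 1.0], 1)]
    mu = train(data, dim=3)
    for x, t in data:
        print(x, t, round(p_output_one(mu, 1.0, x), 3))

    In the multilayer setting treated in the paper these probabilities have no such closed form, which is where the gradient-estimation expense bounded in the abstract arises.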
  • Keywords
learning systems; neural nets; estimation penalty; gradient descent algorithm; hard-limiting units; multilayer networks; random weights; sigmoidal characteristics; training; Computational modeling; Distribution functions; Feedforward neural networks; Neural networks; Random variables; Upper bound
  • fLanguage
    English
  • Journal_Title
IEEE Transactions on Neural Networks
  • Publisher
IEEE
  • ISSN
    1045-9227
  • Type
    jour
  • DOI
    10.1109/72.125861
  • Filename
    125861