  • DocumentCode
    303371
  • Title
    A recurrent network with stochastic weights
  • Author
    Zhao, Jieyu; Shawe-Taylor, John

  • Author_Institution
    IDSIA, Lugano, Switzerland
  • Volume
    2
  • fYear
    1996
  • fDate
    3-6 Jun 1996
  • Firstpage
    1302
  • Abstract
    Stochastic neural networks for global optimization are usually built by introducing random fluctuations into the network. A natural method is to use stochastic weights rather than stochastic activation functions. We propose a new model in which each neuron has very simple functionality but all the weights are stochastic. It is shown that the network has a unique stationary distribution, which is approximately a Boltzmann-Gibbs distribution when the network is not too small. A new technique for implementing simulated annealing is proposed. Simulation results on the graph bisection problem show that the power of the network is comparable with that of a Boltzmann machine.
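    The abstract uses graph bisection as the benchmark. As a point of reference only (this is a standard simulated-annealing baseline, not the paper's stochastic-weight network; all function names here are illustrative), a minimal balanced-bisection annealer might look like:

    ```python
    import math
    import random

    def bisection_cost(edges, side):
        # Cut size: number of edges whose endpoints lie in different halves.
        return sum(1 for u, v in edges if side[u] != side[v])

    def anneal_bisection(n, edges, steps=20000, t0=2.0, cooling=0.9995, seed=0):
        """Simulated annealing for balanced graph bisection.

        Nodes 0..n-1 are split into two equal halves; each move swaps one
        node from each half, so the partition stays balanced. A move that
        raises the cut by delta is accepted with probability exp(-delta/T),
        and T decays geometrically toward zero.
        """
        rng = random.Random(seed)
        nodes = list(range(n))
        rng.shuffle(nodes)
        half = n // 2
        side = {u: (0 if i < half else 1) for i, u in enumerate(nodes)}
        cost = bisection_cost(edges, side)
        t = t0
        for _ in range(steps):
            a = rng.choice([u for u in side if side[u] == 0])
            b = rng.choice([u for u in side if side[u] == 1])
            side[a], side[b] = 1, 0  # propose: swap a and b across the cut
            new_cost = bisection_cost(edges, side)
            delta = new_cost - cost
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                cost = new_cost  # accept the swap
            else:
                side[a], side[b] = 0, 1  # reject: undo the swap
            t *= cooling
        return side, cost
    ```

    On a small graph made of two cliques joined by a single edge, the annealer typically recovers the optimal cut of one edge.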
  • Keywords
    Boltzmann machines; graph theory; recurrent neural nets; simulated annealing; stochastic processes; Boltzmann-Gibbs distribution; global optimization; graph bisection; recurrent network; stochastic bit stream neuron; stochastic connected Boltzmann machine; stochastic neural networks; stochastic weights; Binary sequences; Computer science; Encoding; Fault tolerance; Fluctuations; Fuzzy neural networks; Neural networks; Neurons; Stochastic processes
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    1996 IEEE International Conference on Neural Networks
  • Conference_Location
    Washington, DC
  • Print_ISBN
    0-7803-3210-5
  • Type
    conf
  • DOI
    10.1109/ICNN.1996.549086
  • Filename
    549086