• DocumentCode
    231989
  • Title
    A recurrent neural network with a tunable activation function for solving k-winners-take-all
  • Author
    Peng Miao ; Yanjun Shen ; Jianshu Hou ; Yi Shen
  • Author_Institution
    Coll. of Sci., China Three Gorges Univ., Yichang, China
  • fYear
    2014
  • fDate
    28-30 July 2014
  • Firstpage
    4957
  • Lastpage
    4962
  • Abstract
    In this paper, a finite-time recurrent neural network with a tunable activation function is presented to solve the k-winners-take-all (k-WTA) problem. The activation function has two tunable parameters, which provide more flexibility in designing the neural network. By Lyapunov theory, the proposed neural network model converges to the equilibrium point in finite time. Compared with existing neural networks, faster convergence can be obtained. In particular, the proposed neural network is highly robust against noise. The effectiveness of the method is validated by theoretical analysis and numerical simulations.
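    Note: the record above does not include the paper's network equations or its tunable activation function. As a point of reference only, the following is a minimal Python sketch of a generic k-WTA recurrent dynamic (a shared-threshold model with a saturating activation), where the function name kwta_recurrent and the parameters gain, alpha, dt, and steps are illustrative assumptions, not the authors' model.

    import numpy as np

    # Minimal sketch (assumption): a classical single-threshold k-WTA recurrent
    # dynamic, not the tunable-activation model proposed in the paper. A shared
    # threshold y is adjusted by negative feedback until exactly k outputs are active.
    def kwta_recurrent(u, k, gain=100.0, alpha=1.0, dt=1e-3, steps=2000):
        u = np.asarray(u, dtype=float)
        y = 0.0
        x = np.clip(gain * (u - y), 0.0, 1.0)
        for _ in range(steps):
            x = np.clip(gain * (u - y), 0.0, 1.0)   # neuron outputs for threshold y
            y += dt * alpha * (x.sum() - k)         # drive the number of active outputs toward k
        return x

    if __name__ == "__main__":
        u = np.array([0.3, 0.9, 0.1, 0.7, 0.5])
        print(kwta_recurrent(u, k=2).round(2))      # the two largest inputs (0.9 and 0.7) win

    In this illustrative dynamic, the threshold rises while more than k neurons are active and falls while fewer are active, so the state settles where only the k largest inputs stay above threshold; the paper's contribution, per the abstract, is an activation function whose two tunable parameters yield finite-time convergence and noise robustness.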
  • Keywords
    Lyapunov methods; recurrent neural nets; Lyapunov theorem; equilibrium point; finite time recurrent neural network; k-winners-take-all problem solving; neural network design; numerical simulations; tunable activation function; tunable parameters; Educational institutions; Electronic mail; Equations; Mathematical model; Numerical models; Recurrent neural networks; finite-time stability; k-winners-take-all; recurrent neural network; tunable activation function
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Control Conference (CCC), 2014 33rd Chinese
  • Conference_Location
    Nanjing
  • Type
    conf
  • DOI
    10.1109/ChiCC.2014.6895781
  • Filename
    6895781