• DocumentCode
    1748947
  • Title
    A novel adaptive activation function
  • Author
    Xu, Shuxiang ; Zhang, Ming

  • Author_Institution
    Sch. of Comput., Tasmania Univ., Launceston, Tas., Australia
  • Volume
    4
  • fYear
    2001
  • fDate
    2001
  • Firstpage
    2779
  • Abstract
    This paper presents an experimental justification of a novel adaptive activation function for feedforward neural networks (FNNs). Simulation results show that FNNs with the proposed adaptive activation function offer several advantages over traditional fixed-activation feedforward networks: much smaller network size, faster learning, and lower approximation error. Following the definition of the neuron-adaptive activation function, we conduct experiments on function approximation and financial data simulation, and report results that demonstrate the advantages of FNNs with our neuron-adaptive activation function over traditional FNNs with a fixed activation function.
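    The abstract's idea of a neuron-adaptive activation can be sketched as an activation whose shape parameters are trained alongside the network weights. The parametric form below (a sigmoid plus a sine component with per-neuron coefficients) is an assumption for illustration only; the paper's exact functional form may differ.

    ```python
    import numpy as np

    def adaptive_activation(x, a, b, c, d):
        """Hypothetical neuron-adaptive activation: a trainable mix of a
        scaled sigmoid and a sine term. Coefficients a, b, c, d would be
        learned per neuron together with the network weights."""
        return a / (1.0 + np.exp(-b * x)) + c * np.sin(d * x)

    def adaptive_activation_param_grads(x, a, b, c, d):
        """Gradients of the activation w.r.t. its own parameters, so the
        same backpropagation pass that updates weights can also adapt
        the activation's shape."""
        sig = 1.0 / (1.0 + np.exp(-b * x))
        return {
            "a": sig,                         # d f / d a
            "b": a * sig * (1.0 - sig) * x,   # d f / d b
            "c": np.sin(d * x),               # d f / d c
            "d": c * np.cos(d * x) * x,       # d f / d d
        }
    ```

    Because the activation itself has trainable degrees of freedom, fewer hidden neurons can suffice to fit a target function, which is consistent with the reduced network size the abstract reports.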
  • Keywords
    feedforward neural nets; financial data processing; function approximation; learning (artificial intelligence); transfer functions; adaptive activation function; feedforward neural networks; financial data processing; function approximation; learning; Artificial intelligence; Australia; Bismuth; Computational modeling; Computer networks; Electronic mail; Feedforward neural networks; Feedforward systems; Function approximation; Neural networks
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Neural Networks, 2001. Proceedings. IJCNN '01. International Joint Conference on
  • Conference_Location
    Washington, DC
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7044-9
  • Type
    conf
  • DOI
    10.1109/IJCNN.2001.938813
  • Filename
    938813