• DocumentCode
    253389
  • Title
    A skewed derivative activation function for SFFANNs

  • Author
    Chandra, P.; Sodhi, Sartaj Singh
  • Author_Institution
    Sch. of Inf. & Commun. Technol., Guru Gobind Singh Indraprastha Univ., New Delhi, India
  • fYear
    2014
  • fDate
    9-11 May 2014
  • Firstpage
    1
  • Lastpage
    6
  • Abstract
    In the current paper, a new activation function is proposed for use in constructing sigmoidal feedforward artificial neural networks. The suitability of the proposed activation function is established. The derivative of the proposed activation function is skewed, whereas the derivatives of the commonly used activation functions are symmetric about the y-axis (as for the log-sigmoid or the hyperbolic tangent function). The efficiency and efficacy of the proposed activation function are demonstrated on six function approximation tasks. The obtained results indicate that networks using the proposed activation function in the hidden layer converge to deeper minima of the error functional, generalize better, and converge faster than networks using the standard log-sigmoid activation function in the hidden layer (see the illustrative sketch after this record).
  • Keywords
    feedforward neural nets; function approximation; transfer functions; SFFANNs; function approximation tasks; hyperbolic tangent function; log-sigmoidal activation function; sigmoidal feedforward artificial neural networks; skewed derivative activation function; Function approximation; Neural networks; Silicon; Standards; Training;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Title
    Recent Advances and Innovations in Engineering (ICRAIE), 2014
  • Conference_Location
    Jaipur
  • Print_ISBN
    978-1-4799-4041-7
  • Type
    conf
  • DOI
    10.1109/ICRAIE.2014.6909324
  • Filename
    6909324
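
The abstract contrasts activations whose derivatives are symmetric about the y-axis (log-sigmoid, hyperbolic tangent) with one whose derivative is skewed. The record does not give the paper's functional form, so the short Python sketch below uses a hypothetical stand-in, g(x) = sigmoid(x)^2, purely to illustrate the symmetric-versus-skewed derivative distinction; it is not the activation proposed by the authors.

# Illustrative sketch only: g(x) = sigmoid(x)^2 is a hypothetical stand-in,
# not the paper's activation. It merely shows what a "skewed derivative"
# means, contrasted with the log-sigmoid, whose derivative is an even
# function (symmetric about the y-axis).
import numpy as np

def log_sigmoid(x):
    # Standard log-sigmoid; its derivative s(x) * (1 - s(x)) satisfies f'(-x) = f'(x).
    return 1.0 / (1.0 + np.exp(-x))

def log_sigmoid_deriv(x):
    s = log_sigmoid(x)
    return s * (1.0 - s)

def skewed_example(x):
    # Hypothetical sigmoidal activation (bounded in (0, 1), monotone increasing)
    # whose derivative 2 * s(x)^2 * (1 - s(x)) is NOT symmetric about the y-axis.
    s = log_sigmoid(x)
    return s * s

def skewed_example_deriv(x):
    s = log_sigmoid(x)
    return 2.0 * s * s * (1.0 - s)

if __name__ == "__main__":
    xs = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
    # Even derivative: values at -x and +x coincide.
    print("log-sigmoid derivative:", log_sigmoid_deriv(xs))
    # Skewed derivative: values at -x and +x differ (its peak lies off the origin).
    print("skewed derivative:     ", skewed_example_deriv(xs))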