  • DocumentCode
    144489
  • Title
    Dynamic Growth of Hidden-Layer Neurons Using the Non-extensive Entropy
  • Author
    Susan, Seba; Dwivedi, Monika

  • Author_Institution
    Dept. of Inf. Technol., Delhi Technol. Univ., New Delhi, India
  • fYear
    2014
  • fDate
    7-9 April 2014
  • Firstpage
    491
  • Lastpage
    495
  • Abstract
    In this paper we present a neural network that dynamically grows the number of hidden-layer neurons based on an increase in the entropy of the weights during training. The weights are normalized to probability values prior to the computation of the entropy. The entropy referred to is the non-extensive entropy recently proposed by Susan and Hanmandlu for the representation of structured data. Incrementally growing the hidden layer as required leads to better tuning of the network weights and high classification performance, as demonstrated by the empirical results. (A toy sketch of this growth rule follows the record below.)
  • Keywords
    entropy; neural nets; pattern classification; probability; classification performance; dynamic neural network; hidden-layer neurons dynamic growth; network weight tuning; nonextensive entropy; probability values; structured data representation; Algorithm design and analysis; Biological neural networks; Cybernetics; Entropy; Feedforward neural networks; Neurons; Training; Dynamic Neural Network; Dynamic growth of neurons; Hidden layer neurons; Multi-layer perceptron; Non-extensive entropy; Susan and Hanmandlu entropy; Weighted sum of non-extensive entropies;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Communication Systems and Network Technologies (CSNT), 2014 Fourth International Conference on
  • Conference_Location
    Bhopal
  • Print_ISBN
    978-1-4799-3069-2
  • Type
    conf
  • DOI
    10.1109/CSNT.2014.104
  • Filename
    6821445
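
The abstract above describes an entropy-driven growth rule: normalize the hidden-layer weights to probability values, track an entropy over them during training, and add a hidden neuron whenever that entropy increases. Below is a minimal NumPy sketch of such a loop, assuming a toy two-layer perceptron; the exponential-form entropy, the growth threshold, the data, and the layer sizes are illustrative stand-ins rather than the Susan and Hanmandlu definitions from the paper.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def weight_entropy(W):
    # Normalize weight magnitudes to probability values, then compute an
    # entropy over them. The exponential form below is only a stand-in for
    # the non-extensive entropy defined by Susan and Hanmandlu.
    p = np.abs(W).ravel()
    p = p / p.sum()
    return float(np.sum(p * np.exp(1.0 - p)))

# Hypothetical toy data: 200 samples, 4 features, linearly separable labels.
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

n_in, n_hidden, n_out = 4, 2, 1          # illustrative layer sizes
W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))
lr, grow_tol = 0.1, 1e-3                 # assumed learning rate and growth threshold
prev_H = weight_entropy(W1)

for epoch in range(50):
    # Forward pass and plain gradient-descent update of both weight layers.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    delta_out = (out - y) * out * (1.0 - out)
    delta_h = (delta_out @ W2.T) * h * (1.0 - h)
    W2 -= lr * (h.T @ delta_out) / len(X)
    W1 -= lr * (X.T @ delta_h) / len(X)

    # Grow the hidden layer when the entropy of the normalized weights rises.
    H = weight_entropy(W1)
    if H - prev_H > grow_tol:
        W1 = np.hstack([W1, rng.normal(scale=0.5, size=(n_in, 1))])
        W2 = np.vstack([W2, rng.normal(scale=0.5, size=(1, n_out))])
        n_hidden += 1
    prev_H = weight_entropy(W1)          # recompute after a possible growth step

print("final hidden-layer size:", n_hidden)

The growth step simply appends a freshly initialized column to the input-to-hidden weights and a matching row to the hidden-to-output weights, so training continues without reinitializing the neurons already learned.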