• DocumentCode
    3696894
  • Title
    Denoising AutoEncoder in Neural Networks with Modified Elliott Activation Function and Sparsity-Favoring Cost Function

  • Author
    Hasham Burhani;Wenying Feng;Gongzhu Hu

  • Author_Institution
    Depts. of Comput. &
  • fYear
    2015
  • fDate
    7/1/2015
  • Firstpage
    343
  • Lastpage
    348
  • Abstract
    Neural networks (NN) are architectures and algorithms for machine learning. They are quite powerful for tasks such as classification, clustering, and pattern recognition. Large neural networks can act as universal function approximators, and hence learn from the training data not only the useful information but also the noise. However, as the numbers of neurons and hidden layers grow, the number of connections in the network increases exponentially, and the overfitting problem, biased towards the noise, becomes more severe. Various methods have been proposed to address this problem, such as AutoEncoder, Dropout, DropConnect, and Factored Mean training. In this paper, we propose a denoising autoencoder approach using a modified Elliott activation function and a cost function that favors sparsity in the input data. Preliminary experiments applying the modified algorithm to several real data sets showed that the proposed approach performs well.
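    The approach outlined in the abstract can be sketched in code. The record does not give the paper's exact modified Elliott function or its sparsity-favoring cost, so the sketch below assumes the standard Elliott activation, f(x) = x / (1 + |x|), and substitutes an L1 penalty on the hidden code as a stand-in sparsity term; the class and parameter names are illustrative, not the authors' implementation.

    ```python
    import numpy as np

    def elliott(x):
        # Standard (unmodified) Elliott activation: a cheap sigmoid-like
        # squashing function, bounded in (-1, 1).
        return x / (1.0 + np.abs(x))

    def elliott_grad(x):
        # Derivative of the Elliott activation: 1 / (1 + |x|)^2.
        return 1.0 / (1.0 + np.abs(x)) ** 2

    class DenoisingAutoencoder:
        """One-hidden-layer denoising autoencoder sketch (tied weights).

        Corrupts inputs with Gaussian noise, encodes with the Elliott
        activation, and trains on a reconstruction loss plus an L1
        sparsity penalty on the hidden code -- an assumed stand-in for
        the paper's sparsity-favoring cost function.
        """

        def __init__(self, n_in, n_hidden, seed=0):
            rng = np.random.default_rng(seed)
            self.W = rng.normal(0.0, 0.1, (n_in, n_hidden))
            self.b = np.zeros(n_hidden)       # encoder bias
            self.b_out = np.zeros(n_in)       # decoder bias

        def encode(self, x):
            return elliott(x @ self.W + self.b)

        def decode(self, h):
            # Tied weights, linear output layer.
            return h @ self.W.T + self.b_out

        def step(self, x, noise=0.1, lam=1e-3, lr=0.1, rng=None):
            """One full-batch gradient step; returns the current cost."""
            rng = rng or np.random.default_rng()
            x_noisy = x + rng.normal(0.0, noise, x.shape)
            pre = x_noisy @ self.W + self.b
            h = elliott(pre)
            x_hat = self.decode(h)
            n = x.shape[0]
            # Cost = mean reconstruction error + L1 sparsity penalty.
            cost = (0.5 * np.sum((x_hat - x) ** 2)
                    + lam * np.sum(np.abs(h))) / n
            # Backpropagation through the tied-weight architecture.
            d_out = (x_hat - x) / n
            d_h = d_out @ self.W + lam * np.sign(h) / n
            d_pre = d_h * elliott_grad(pre)
            self.W -= lr * (x_noisy.T @ d_pre + d_out.T @ h)
            self.b -= lr * d_pre.sum(axis=0)
            self.b_out -= lr * d_out.sum(axis=0)
            return cost
    ```

    A short training loop on random data (e.g. 500 calls to `step`) should drive the cost down as the tied weights learn a compressed, sparse code of the denoised input.
    
    
    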
  • Keywords
    "Neurons","Biological neural networks","Training","Noise reduction","Cost function","Machine learning algorithms"
  • Publisher
    ieee
  • Conference_Titel
    2015 3rd International Conference on Applied Computing and Information Technology / 2nd International Conference on Computational Science and Intelligence (ACIT-CSI)
  • Type
    conf
  • DOI
    10.1109/ACIT-CSI.2015.67
  • Filename
    7336086