• DocumentCode
    2053522
  • Title
    Using NCL, an effective way to improve combination methods of neural classifiers
  • Author
    Iranzad, Arash ; Masoudnia, Saeed ; Cheraghchi, Fatemeh ; Nowzari, Abbas ; Ebrahimpour, Reza
  • Author_Institution
    Comput. Sci. Dept., Univ. of Tehran, Tehran, Iran
  • fYear
    2010
  • fDate
    7-10 Dec. 2010
  • Firstpage
    309
  • Lastpage
    313
  • Abstract
    This paper investigates the effect of the diversity induced by Negative Correlation Learning (NCL) on the combination of neural classifiers and presents an efficient way to improve combination performance. Decision Templates and Averaging, as two non-trainable combining methods, and Stacked Generalization, as a trainable combiner, are investigated in our experiments. Utilizing NCL to diversify the base classifiers leads to significantly better results with all of the employed combining methods. Experimental results on five datasets from the UCI repository indicate that, by employing NCL, the performance of the ensemble structure is more favorable than that of an ensemble built from independently trained base classifiers.
  • Keywords
    generalisation (artificial intelligence); pattern classification; classifier combination method; decision template; negative correlation learning; neural classifier; nontrainable combining method; stacked generalization; Artificial neural networks; Classification algorithms; Correlation; Diversity reception; Pattern recognition; Sonar; Training; Averaging; Classifier Combination; Decision Templates; Negative Correlation Learning; Stacked Generalization;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2010 International Conference of Soft Computing and Pattern Recognition (SoCPaR)
  • Conference_Location
    Paris
  • Print_ISBN
    978-1-4244-7897-2
  • Type
    conf
  • DOI
    10.1109/SOCPAR.2010.5686642
  • Filename
    5686642
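
For context, below is a minimal sketch of the Negative Correlation Learning penalty mentioned in the abstract (following Liu and Yao's standard formulation) together with a simple averaging combiner. The NumPy implementation, variable names, penalty strength, and toy numbers are illustrative assumptions, not code or settings taken from this paper.

```python
# Minimal sketch of the NCL penalty and a simple averaging combiner.
# Assumptions: scalar regression-style targets and a penalty strength
# lambda_ chosen for illustration only.
import numpy as np

def ncl_loss(outputs, target, lambda_=0.5):
    """Per-network NCL loss: squared error plus a penalty that pushes each
    network's output away from the ensemble mean, encouraging diversity.

    outputs : array of shape (n_networks,), outputs of the base networks
    target  : scalar target value
    lambda_ : penalty strength in [0, 1]
    """
    f_bar = outputs.mean()
    # p_i = (f_i - f_bar) * sum_{j != i} (f_j - f_bar) = -(f_i - f_bar)^2,
    # since the deviations from the ensemble mean sum to zero.
    penalty = -(outputs - f_bar) ** 2
    return 0.5 * (outputs - target) ** 2 + lambda_ * penalty

def average_combiner(outputs):
    """Non-trainable combiner: simple average over the base classifiers
    (axis 0 indexes the networks)."""
    return np.mean(outputs, axis=0)

if __name__ == "__main__":
    outputs = np.array([0.8, 0.6, 0.9])   # toy outputs of three base networks
    target = 1.0
    print("per-network NCL losses:", ncl_loss(outputs, target))
    print("ensemble (average) output:", average_combiner(outputs))
```

In this sketch a negative penalty value rewards outputs that deviate from the ensemble mean, which is how NCL trades some individual accuracy for ensemble diversity before the outputs are combined by Averaging, Decision Templates, or Stacked Generalization.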