  • DocumentCode
    395170
  • Title
    Efficient subspace learning using a large scale neural network CombNet-II
  • Author
    Ghaibeh, A. Ammar ; Kuroyanagi, Susumu ; Iwata, Akira
  • Author_Institution
    Dept. of Electr. & Comput. Eng., Nagoya Inst. of Technol., Japan
  • Volume
    1
  • fYear
    2002
  • fDate
    18-22 Nov. 2002
  • Firstpage
    447
  • Abstract
    In the field of artificial neural networks, large-scale classification problems remain challenging due to obstacles such as local minima, long computation times, and the requirement for a large amount of memory. The large-scale network CombNET-II overcomes the local minima problem and has been shown to give good recognition rates in many applications. However, CombNET-II still requires a large amount of memory for the training database and feature space. We propose a revised version of CombNET-II with a considerably lower memory requirement, which makes large-scale classification problems more tractable. The memory reduction is achieved by adding a preprocessing stage at the input of each branch network. The purpose of this stage is to select the features that have the most classification power for each subspace generated by the stem network. Testing our proposed model on Japanese kanji characters shows that the required memory can be reduced by almost 50% without a significant decrease in the recognition rate.
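    The per-subspace feature selection described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual method: the abstract does not specify the scoring criterion, so a Fisher score (between-class variance over within-class variance) is assumed here, and the function names and `keep_ratio` parameter are hypothetical.

    ```python
    import numpy as np

    def fisher_scores(X, y):
        # Assumed criterion: Fisher score per feature, i.e. the ratio of
        # between-class variance to within-class variance (higher = more
        # discriminative). The paper's abstract only says "most
        # classification power"; this is one common way to measure it.
        classes = np.unique(y)
        overall_mean = X.mean(axis=0)
        between = np.zeros(X.shape[1])
        within = np.zeros(X.shape[1])
        for c in classes:
            Xc = X[y == c]
            between += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
            within += len(Xc) * Xc.var(axis=0)
        return between / (within + 1e-12)

    def select_features_per_subspace(subspaces, keep_ratio=0.5):
        # subspaces: list of (X, y) pairs, one per cluster produced by the
        # stem network. A separate feature mask is chosen for each branch,
        # so each branch network only stores its own reduced feature space.
        masks = []
        for X, y in subspaces:
            scores = fisher_scores(X, y)
            k = max(1, int(keep_ratio * X.shape[1]))
            top = np.argsort(scores)[-k:]          # indices of the k best features
            mask = np.zeros(X.shape[1], dtype=bool)
            mask[top] = True
            masks.append(mask)
        return masks
    ```

    With `keep_ratio=0.5`, each branch keeps half of the original features, which mirrors the roughly 50% memory reduction reported in the abstract.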
  • Keywords
    learning (artificial intelligence); neural nets; optical character recognition; pattern classification; CombNET-II; Japanese kanji characters; artificial neural networks; large scale neural network; large-scale classification; large-scale classification problems; large-scale network; local minima state; memory reduction; preprocessing stage; recognition rate; subspace learning; time computation; Artificial neural networks; Character recognition; Computer networks; Data preprocessing; Large-scale systems; Mars; Multi-layer neural network; Neural networks; Spatial databases; Testing;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 9th International Conference on Neural Information Processing (ICONIP '02), 2002
  • Print_ISBN
    981-04-7524-1
  • Type
    conf
  • DOI
    10.1109/ICONIP.2002.1202210
  • Filename
    1202210