• DocumentCode
    3208128
  • Title
    Global optimization methods for designing and training neural networks

  • Author
    Yamazaki, Akio; Ludermir, Teresa B.; de Souto, Marcílio C. P.

  • Author_Institution
    Center of Informatics, Federal University of Pernambuco, Recife, Brazil
  • fYear
    2002
  • fDate
    2002
  • Firstpage
    136
  • Lastpage
    141
  • Abstract
    This paper presents results from two approaches to the optimization of neural networks: one uses simulated annealing to optimize both architectures and weights, combined with backpropagation for fine-tuning, while the other uses tabu search for the same purpose. Both approaches generate networks with good generalization performance (mean classification error of 1.68% for simulated annealing and 0.64% for tabu search) and low complexity (mean number of connections of 11.15 out of 36 for simulated annealing and 11.62 out of 36 for tabu search) on an odor recognition task in an artificial nose.
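The abstract's idea of optimizing architecture and weights jointly can be illustrated with a minimal simulated-annealing sketch. This is not the paper's algorithm or dataset: it uses toy XOR data instead of the artificial-nose task, a fixed 2-2-1 network, and arbitrary cooling parameters. A binary mask over the connections plays the role of the architecture, so a single candidate solution carries both which connections exist and their weights.

```python
import math
import random

random.seed(42)

# Toy data (illustrative stand-in for the odor recognition task).
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

N_W = 9  # 2-2-1 net: 4 input->hidden weights, 2 hidden->output, 2 hidden biases, 1 output bias


def forward(x, w, mask):
    """Each weight is gated by a binary mask bit, so the architecture
    (which connections exist) is optimized along with the weights."""
    h = []
    for j in range(2):
        s = w[6 + j] * mask[6 + j]  # hidden bias
        for i in range(2):
            k = 2 * j + i
            s += x[i] * w[k] * mask[k]
        h.append(math.tanh(s))
    out = w[8] * mask[8]  # output bias
    for j in range(2):
        out += h[j] * w[4 + j] * mask[4 + j]
    return out


def cost(w, mask):
    # Mean squared error over the toy data set.
    return sum((forward(x, w, mask) - y) ** 2 for x, y in DATA) / len(DATA)


def neighbour(w, mask):
    # A move either flips one connection on/off or perturbs one weight.
    w2, m2 = list(w), list(mask)
    k = random.randrange(N_W)
    if random.random() < 0.2:
        m2[k] = 1 - m2[k]              # add or remove a connection
    else:
        w2[k] += random.gauss(0, 0.5)  # perturb a weight
    return w2, m2


def anneal(iters=2000, t0=1.0, alpha=0.995):
    w = [random.uniform(-1, 1) for _ in range(N_W)]
    mask = [1] * N_W
    c = cost(w, mask)
    best = (c, list(w), list(mask))
    t = t0
    for _ in range(iters):
        w2, m2 = neighbour(w, mask)
        c2 = cost(w2, m2)
        # Metropolis rule: always accept improvements; accept worse
        # moves with probability exp(-delta / t).
        if c2 < c or random.random() < math.exp(-(c2 - c) / t):
            w, mask, c = w2, m2, c2
            if c < best[0]:
                best = (c, list(w), list(mask))
        t *= alpha  # geometric cooling schedule
    return best


best_cost, best_w, best_mask = anneal()
```

In the paper's setup a backpropagation pass would then fine-tune the surviving connections; here the sketch stops at the annealing stage.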
  • Keywords
    backpropagation; generalisation (artificial intelligence); neural nets; pattern classification; search problems; simulated annealing; artificial nose; generalization; neural networks; odor recognition; optimization; tabu search; Artificial neural networks; Backpropagation; Costs; Design methodology; Informatics; Information processing; Neural networks; Nose; Optimization methods; Simulated annealing
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the VII Brazilian Symposium on Neural Networks (SBRN 2002)
  • Print_ISBN
    0-7695-1709-9
  • Type
    conf
  • DOI
    10.1109/SBRN.2002.1181455
  • Filename
    1181455