• DocumentCode
    288581
  • Title
    An automated design system for finding the minimal configuration of a feed-forward neural network
  • Author
    Teng, Chin-Chi; Wah, Benjamin W.

  • Author_Institution
    Coordinated Sci. Lab., Illinois Univ., Urbana, IL, USA
  • Volume
    3
  • fYear
    1994
  • fDate
    27 Jun-2 Jul 1994
  • Firstpage
    1295
  • Abstract
    In this paper, we present a method for finding the minimal configuration of a feedforward artificial neural network (ANN) for solving a given application problem. We assume that the cascade-correlation (CAS) training algorithm is used to train the weights of the ANNs concerned. Under a given time constraint that is long enough to train tens of ANNs completely, we divide the time into quanta and present a method for dynamically scheduling the ANN to be trained in each quantum from a pool of partially trained ANNs. Our goal is to find an ANN configuration with a smaller number of hidden units than the alternative of applying the CAS algorithm repeatedly to train each ANN to completion before exploring new ANNs. Our system is a population-based generate-and-test method that maintains a population of candidate ANNs and selectively trains those that are predicted to require smaller configurations. Since it is difficult to predict the exact number of hidden units required when the CAS algorithm terminates, our system compares two partially trained ANNs and predicts which one will converge with a smaller number of hidden units relative to the other. Our prediction mechanism is based on a comparator neural network (CANN) that takes as inputs the TSSE-versus-time behavior of the training performed so far on two ANNs and predicts which one will require a smaller number of hidden units when convergence is reached. We show that our CANN predicts correctly most of the time, and we present experimental results on better configurations found within a given time limit for a classification problem and the two-spiral problem.
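    The scheduling idea in the abstract can be sketched in a few lines: keep a pool of partially trained candidates, and each quantum use pairwise comparison to pick which one to train next. The sketch below is illustrative only; the `Candidate` class, the simulated TSSE decay, and the slope-based `comparator` are all hypothetical stand-ins (the paper uses the actual CAS algorithm and a trained CANN, neither of which is reproduced here).

    ```python
    import random

    # Hypothetical stand-in for one partially trained cascade-correlation ANN.
    # `tsse_curve` records the TSSE-versus-time behavior that the paper's
    # comparator network takes as input.
    class Candidate:
        def __init__(self, seed):
            self.rng = random.Random(seed)
            self.tsse_curve = [1.0]
            self.hidden_units = 0

        def train_quantum(self):
            # Placeholder for one quantum of CAS training: TSSE decays at a
            # candidate-specific rate, and hidden units are added periodically.
            # A real system would run cascade-correlation here.
            rate = 0.5 + 0.4 * self.rng.random()
            self.tsse_curve.append(self.tsse_curve[-1] * rate)
            if len(self.tsse_curve) % 3 == 0:
                self.hidden_units += 1

    def comparator(a, b):
        # Heuristic stand-in for the paper's CANN: prefer the candidate whose
        # TSSE has fallen faster so far, as a proxy for converging with fewer
        # hidden units.
        slope = lambda c: c.tsse_curve[-1] / c.tsse_curve[0]
        return a if slope(a) < slope(b) else b

    def schedule(pool, quanta):
        # Each quantum, select the most promising candidate via pairwise
        # comparisons over the population, then train only that one.
        for _ in range(quanta):
            best = pool[0]
            for cand in pool[1:]:
                best = comparator(best, cand)
            best.train_quantum()
        # Report the candidate with the fewest hidden units, tie-broken by
        # final TSSE.
        return min(pool, key=lambda c: (c.hidden_units, c.tsse_curve[-1]))

    pool = [Candidate(seed) for seed in range(8)]
    winner = schedule(pool, quanta=40)
    print(winner.hidden_units, winner.tsse_curve[-1])
    ```

    The point of the structure is that training effort concentrates on candidates the comparator favors, rather than being spent training every ANN to completion in turn.
    
    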
  • Keywords
    correlation methods; feedforward neural nets; learning (artificial intelligence); automated design system; cascade-correlation training algorithm; classification problem; comparator neural network; convergence; dynamic scheduling; feedforward neural network; minimal configuration; population-based generate-and-test method; time constraint; two-spiral problem; Artificial neural networks; Content addressable storage; Convergence; Dynamic scheduling; Feedforward neural networks; Feedforward systems; Neural networks; Spirals; Supervised learning; Time factors
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
  • Conference_Location
    Orlando, FL
  • Print_ISBN
    0-7803-1901-X
  • Type
    conf
  • DOI
    10.1109/ICNN.1994.374471
  • Filename
    374471