• DocumentCode
    295979
  • Title
    Entropic optimum synthesis of multi-layered feed-forward ANNs
  • Author
    Pelagotti, Andrea ; Piuri, Vincenzo
  • Author_Institution
    Dept. of Electron. & Inf., Politecnico di Milano, Italy
  • Volume
    1
  • fYear
    1995
  • fDate
    Nov/Dec 1995
  • Firstpage
    253
  • Abstract
    Optimization of the neural architecture is often critical to designing an efficient and feasible solution, particularly when a VLSI implementation is considered. This paper proposes an original approach to the synthesis of multilayered feed-forward ANNs based on the analysis of the information quantity flowing through the network. A layer is described as an information filter which selects the relevant characteristics until the complete classification is performed. The basic incremental method, including the supervised training procedure, is derived to design optimum (or nearly-optimum) neural paradigms. A significant variant is also proposed to improve performance.
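    The abstract's idea of treating each layer as an information filter suggests an entropy-driven growth criterion: estimate the information carried by a layer's (discretized) outputs on the training set and keep adding layers until that quantity drops to the entropy of the class labels. The sketch below is a minimal, hypothetical Python illustration of such a loop, not the incremental method the authors derive; the `train_layer` callback, the binarization of activations, and the stopping rule are all assumptions made for the example.

```python
import numpy as np

def empirical_entropy(codes):
    """Shannon entropy (bits) of an empirical distribution over discrete codes."""
    _, counts = np.unique(codes, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def layer_output_entropy(activations, threshold=0.5):
    """Entropy of a layer's output after binarizing its activations.

    Binarization is a simplifying assumption used here to obtain a
    discrete code whose entropy can be estimated from the data.
    """
    codes = (activations >= threshold).astype(int)
    return empirical_entropy(codes)

def grow_until_entropy_matches(train_layer, X, y, max_layers=10, tol=1e-3):
    """Illustrative incremental loop: add (and train) one layer at a time
    while the information carried by the current representation still
    exceeds what is needed to separate the classes.

    `train_layer(X)` is a hypothetical callback that trains one new layer
    on representation X and returns its activations on the training set.
    """
    target = empirical_entropy(y)      # entropy of the class labels
    rep = X
    for depth in range(1, max_layers + 1):
        rep = train_layer(rep)         # supervised training of one layer
        h = layer_output_entropy(rep)
        print(f"layer {depth}: output entropy = {h:.3f} bits (target {target:.3f})")
        if abs(h - target) <= tol:     # representation carries only class information
            break
    return rep
```

    In this reading, each added layer filters out irrelevant variability, so the output entropy decreases monotonically toward the label entropy; the network stops growing once no excess information remains to be discarded.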
  • Keywords
    feedforward neural nets; learning (artificial intelligence); multilayer perceptrons; neural net architecture; optimisation; VLSI implementation; basic incremental method; entropic optimum synthesis; information filter; multilayered feedforward ANNs; neural architecture; supervised training procedure; Artificial neural networks; Computer networks; Design optimization; Feedforward systems; Information analysis; Information filters; Integrated circuit interconnections; Network synthesis; Neurons; Very large scale integration;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Neural Networks, 1995. Proceedings., IEEE International Conference on
  • Conference_Location
    Perth, WA
  • Print_ISBN
    0-7803-2768-3
  • Type
    conf
  • DOI
    10.1109/ICNN.1995.488104
  • Filename
    488104