• DocumentCode
    303249
  • Title
    Second differentials in arbitrary feedforward neural networks
  • Author
    Rossi, Fabrice
  • Author_Institution
    Thomson-CSF, Bagneux, France
  • Volume
    1
  • fYear
    1996
  • fDate
    3-6 Jun 1996
  • Firstpage
    418
  • Abstract
    We extend here a general mathematical model for feedforward neural networks. Such a network is represented as a vectorial function f of two variables, x (the input of the network) and w (the weight vector). We have already shown that the differential of f can be computed with an extended back-propagation algorithm as well as with a direct method. In this paper, we show that the second differentials of f can also be computed with several different algorithms. Evaluating the theoretical complexities of these methods allows one to choose the fastest algorithm for a particular architecture. This will allow us to handle arbitrary feedforward neural network learning with the help of recent training and analysis techniques based on the Hessian matrix of the error.
  • Keywords
    Hessian matrices; feedforward neural nets; learning (artificial intelligence); Hessian matrix; extended backpropagation algorithm; fastest algorithm; feedforward neural networks; second differentials; vectorial function; Communication system control; Computer architecture; Computer networks; Electronic mail; Feedforward neural networks; Feedforward systems; Intelligent networks; Mathematical model; Neural networks; Neurons;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Neural Networks, 1996. IEEE International Conference on
  • Conference_Location
    Washington, DC
  • Print_ISBN
    0-7803-3210-5
  • Type
    conf
  • DOI
    10.1109/ICNN.1996.548929
  • Filename
    548929
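The abstract describes computing the Hessian of a network's error with respect to the weight vector w. As a minimal illustrative sketch (not the paper's exact algorithms, which derive second differentials analytically), the quantity in question can be approximated numerically for a tiny one-hidden-layer network via central finite differences; the network sizes and squared-error loss below are assumptions for illustration only:

```python
import numpy as np

def f(x, w):
    # Hypothetical tiny network: w packs a 2x2 hidden layer and a 1x2 output layer.
    W1 = w[:4].reshape(2, 2)
    W2 = w[4:6].reshape(1, 2)
    h = np.tanh(W1 @ x)          # hidden activations
    return (W2 @ h)[0]           # scalar network output

def error(w, x, t):
    # Squared error between network output and target t.
    return 0.5 * (f(x, w) - t) ** 2

def hessian(w, x, t, eps=1e-5):
    # Central-difference approximation of the Hessian of the error w.r.t. w.
    n = w.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            wpp = w.copy(); wpp[i] += eps; wpp[j] += eps
            wpm = w.copy(); wpm[i] += eps; wpm[j] -= eps
            wmp = w.copy(); wmp[i] -= eps; wmp[j] += eps
            wmm = w.copy(); wmm[i] -= eps; wmm[j] -= eps
            H[i, j] = (error(wpp, x, t) - error(wpm, x, t)
                       - error(wmp, x, t) + error(wmm, x, t)) / (4 * eps ** 2)
    return H

rng = np.random.default_rng(0)
w = rng.normal(size=6)
x = np.array([0.3, -0.7])
H = hessian(w, x, t=1.0)
```

The paper's point is precisely that such an O(n²) numerical scheme is far more expensive than exact second-differential algorithms, whose complexities can be compared per architecture; the sketch serves only to fix what object is being computed (note that H comes out symmetric, as a Hessian must).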