  • DocumentCode
    1621176
  • Title
    Training MLPs via the expectation maximization algorithm
  • Author
    Cook, G.D.; Robinson, A.J.
  • Author_Institution
    Cambridge Univ., UK
  • fYear
    1995
  • Firstpage
    47
  • Lastpage
    52
  • Abstract
    Two factors that inhibit the use of multilayer perceptrons (MLPs) by the pattern recognition and statistical modelling community are the long training times and a poorly developed theory of network internals. The paper proposes a solution to these problems by introducing a framework in which the hidden nodes of a multilayer perceptron are models of a binary stochastic process governed by the Bernoulli probability density function. Within this framework, we present an expectation maximisation (EM) algorithm for training the network. The algorithm iteratively finds the most probable internal representation given the network inputs and desired outputs. The internal representation forms target values for the hidden nodes, so that the desired output of every node in the network is specified. This removes the need for backpropagation of errors to modify the hidden-layer weights, and effectively decomposes the MLP into two single-layer perceptrons whose parameters can be efficiently optimised. We demonstrate the use of our algorithm on a number of binary and multiway classification tasks.
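    The decomposition described in the abstract can be sketched in code. The following is a minimal, hypothetical illustration (not the authors' implementation): hidden units are treated as Bernoulli variables, an E-step enumerates binary hidden vectors to find the most probable internal representation given the input and desired output, and an M-step fits the two resulting single-layer logistic perceptrons to those fully specified targets. All function names, the enumeration-based E-step, and the gradient-ascent M-step are illustrative assumptions; enumeration is only feasible for a small number of hidden nodes.

    ```python
    # Illustrative sketch of EM-style MLP training with Bernoulli hidden nodes.
    # Not the paper's algorithm verbatim; an assumed, simplified variant.
    import itertools
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def bernoulli_ll(p, t):
        # Log-likelihood of binary targets t under Bernoulli means p.
        eps = 1e-9
        return np.sum(t * np.log(p + eps) + (1 - t) * np.log(1 - p + eps))

    def e_step(x, y, W1, W2, n_hidden):
        # Most probable binary hidden vector given input x and desired output y,
        # found by exhaustive enumeration (feasible only for small n_hidden).
        best_h, best_ll = None, -np.inf
        p_h = sigmoid(W1 @ x)                      # P(h_j = 1 | x)
        for h in itertools.product([0.0, 1.0], repeat=n_hidden):
            h = np.array(h)
            p_y = sigmoid(W2 @ np.append(h, 1.0))  # P(y = 1 | h), with bias unit
            ll = bernoulli_ll(p_h, h) + bernoulli_ll(p_y, y)
            if ll > best_ll:
                best_h, best_ll = h, ll
        return best_h

    def m_step(inputs, targets, W, lr=0.5, epochs=100):
        # Single-layer logistic perceptron trained by gradient ascent on the
        # Bernoulli log-likelihood; no backpropagation through hidden layers.
        for _ in range(epochs):
            for x, t in zip(inputs, targets):
                p = sigmoid(W @ x)
                W += lr * np.outer(t - p, x)
        return W

    def train_em(X, Y, n_hidden=2, iters=10, seed=0):
        rng = np.random.default_rng(seed)
        Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias input
        W1 = rng.normal(0, 0.5, (n_hidden, Xb.shape[1]))
        W2 = rng.normal(0, 0.5, (Y.shape[1], n_hidden + 1))
        for _ in range(iters):
            # E-step: internal representation = hidden targets for every example.
            H = np.array([e_step(x, y, W1, W2, n_hidden) for x, y in zip(Xb, Y)])
            Hb = np.hstack([H, np.ones((len(H), 1))])
            # M-step: optimise the two single-layer perceptrons independently.
            W1 = m_step(Xb, H, W1)
            W2 = m_step(Hb, Y, W2)
        return W1, W2

    def predict(X, W1, W2):
        Xb = np.hstack([X, np.ones((len(X), 1))])
        H = (sigmoid(Xb @ W1.T) > 0.5).astype(float)
        Hb = np.hstack([H, np.ones((len(H), 1))])
        return (sigmoid(Hb @ W2.T) > 0.5).astype(float)
    ```

    Because the E-step specifies desired outputs for every node, each M-step reduces to independent single-layer optimisation problems, which is the efficiency argument made in the abstract.
    
    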
  • Keywords
    iterative methods; learning (artificial intelligence); multilayer perceptrons; probability; stochastic processes; Bernoulli probability density function; MLP training; binary stochastic process; desired outputs; expectation maximization algorithm; hidden nodes; multilayer perceptrons; multiway classification tasks; network inputs; network internals; pattern recognition; probable internal representation; single-layer perceptrons; statistical modelling
  • fLanguage
    English
  • Publisher
    IET
  • Conference_Titel
    Fourth International Conference on Artificial Neural Networks, 1995
  • Conference_Location
    Cambridge
  • Print_ISBN
    0-85296-641-5
  • Type
    conf
  • DOI
    10.1049/cp:19950527
  • Filename
    497789