• DocumentCode
    274147
  • Title
    Building symmetries into feedforward networks
  • Author
    Shawe-Taylor, J.
  • Author_Institution
    R. Holloway & Bedford New Coll., London Univ., UK
  • fYear
    1989
  • fDate
    16-18 Oct 1989
  • Firstpage
    158
  • Lastpage
    162
  • Abstract
    One of the central tools developed by M. Minsky and S. Papert (1988) was the group invariance theorem. This theorem is concerned with choosing perceptron weights to recognise a predicate that is invariant under a group of permutations of the input. The theorem states that the weights can be chosen to be constant on equivalence classes of predicates under the action of the group. This paper presents this result in a graph-theoretic light and then extends consideration to multilayer perceptrons. It is shown that, by choosing a multilayer network in such a way that the action of the group on the input nodes can be extended to the whole network, the invariance of the output under the action of the group can be guaranteed. This greatly reduces the number of degrees of freedom in the training of such a network. An example of using this technique to train a network to recognise isomorphism classes of graphs is given. This compares favourably with previous experiments using standard back-propagation. The connections between the group of symmetries and the network structure are explored, and the relation to the problem of graph isomorphism is discussed.
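    The construction described in the abstract can be illustrated with a minimal sketch (not the paper's own code): for inputs acted on by the cyclic group of rotations, tying the hidden-layer weights along group orbits makes the hidden layer circulant, so a rotation of the input merely permutes the hidden units; a symmetric read-out then yields a group-invariant output. All names and parameter choices below are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 6  # number of inputs, acted on by the cyclic group C_n (rotations)

    # Hidden weights tied along group orbits: one free weight vector generates
    # a circulant matrix, so rotating the input permutes the hidden activations.
    base = rng.normal(size=n)                           # n free weights, not n*n
    W = np.stack([np.roll(base, k) for k in range(n)])  # circulant weight matrix

    v = np.full(n, 0.7)   # read-out weights constant on the single hidden orbit
    b = 0.1               # shared hidden bias

    def net(x):
        h = np.tanh(W @ x + b)   # hidden layer: equivariant under rotation
        return float(v @ h)      # symmetric read-out -> invariant output

    x = rng.normal(size=n)
    outs = [net(np.roll(x, k)) for k in range(n)]  # evaluate on all rotations
    print(np.allclose(outs, outs[0]))              # output is C_n-invariant
    ```

    The point of the weight-tying is the reduction in degrees of freedom the abstract mentions: the hidden layer here has n trainable weights rather than n*n, while invariance holds by construction rather than having to be learned.
    
    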
  • Keywords
    graph theory; neural nets; equivalence classes; feedforward networks; graph isomorphism class recognition; group invariance theorem; multilayer perceptrons; perceptron weights
  • fLanguage
    English
  • Publisher
    IET
  • Conference_Titel
    Artificial Neural Networks, 1989. First IEE International Conference on (Conf. Publ. No. 313)
  • Conference_Location
    London
  • Type
    conf
  • Filename
    51951