• DocumentCode
    1626577
  • Title
    An M-part Sperner theorem with applications to neural networks
  • Author
    Shrivastava, Yash; Dasgupta, Soura
  • Author_Institution
    Centre for Ind. Control Sci., Newcastle, NSW, Australia
  • Volume
    1
  • fYear
    1994
  • Firstpage
    972
  • Abstract
    Sperner's fundamental theorem on the size of a family of sets unordered by set inclusion states that if the members of the family are subsets of an n-element set, then the maximum size of the family is the largest binomial coefficient, C(n, ⌊n/2⌋). Furthermore, any family of size C(n, ⌊n/2⌋) must consist of: 1) all subsets of size n/2 if n is even, and 2) either all subsets of size (n-1)/2 or all subsets of size (n+1)/2 if n is odd. A generalization of this result is given that includes it as a special case. These results are applied to obtain a tight upper bound on the number of stationary points of Hopfield neural networks. A graph-theoretic characterization of the networks achieving this upper bound is also given.
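    For reference, the classical Sperner theorem paraphrased above can be written out as follows; this is a standard textbook formulation, not quoted from the paper, and the paper's M-part generalization is not reproduced here.

      % Sperner's theorem: an antichain F in the Boolean lattice 2^{[n]}
      % (no member of F contains another) has at most binom(n, floor(n/2)) members.
      \[
        \mathcal{F} \subseteq 2^{[n]},\quad
        A \not\subseteq B \ \text{for all distinct } A, B \in \mathcal{F}
        \;\Longrightarrow\;
        |\mathcal{F}| \le \binom{n}{\lfloor n/2 \rfloor},
      \]
      % Equality holds only when F consists of all subsets of size n/2 (n even),
      % or of all subsets of size (n-1)/2 or all subsets of size (n+1)/2 (n odd).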
  • Keywords
    Hopfield neural nets; Hopfield neural networks; graph theory; optimisation; set theory; M-part Sperner theorem; binomial coefficient; upper bound; neural networks
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the 33rd IEEE Conference on Decision and Control, 1994
  • Conference_Location
    Lake Buena Vista, FL
  • Print_ISBN
    0-7803-1968-0
  • Type
    conf
  • DOI
    10.1109/CDC.1994.410929
  • Filename
    410929