• DocumentCode
    423522
  • Title

How chaos boosts the encoding capacity of small recurrent neural networks: learning consideration

  • Author

    Molter, Colin ; Salihoglu, Utku ; Bersini, Hugues

  • Author_Institution
    Laboratory of artificial intelligence IRIDIA, Universite Libre de Bruxelles, Brussels, Belgium
  • Volume
    1
  • fYear
    2004
  • fDate
    25-29 July 2004
  • Lastpage
    80
  • Abstract
So far, recurrent networks, when adopting fixed-point dynamics, have shown very poor encoding capacity. However, the same networks, when preferentially maintained in chaotic dynamics, can encode an enormous amount of information in their cyclic attractors, which boosts their encoding capacity. A previous paper described a simple way to encode such information by robustly associating each vector in an N-dimensional space with one "symbolic" cyclic attractor. Its main message was the monotonic increase of chaotic spontaneous regimes as a function of the number of attractors to learn. However, no algorithm was provided to adjust the connection weights in order to encode a given input set. For this purpose, this paper revisits the classical gradient-based BPTT learning algorithm. It shows that this algorithm gives poor results and, furthermore, that using it strongly dampens the "chaoticity" of the network, and hence its encoding capacity.
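To make the abstract concrete, the following is a minimal sketch (not the authors' code) of the kind of gradient-based BPTT training it revisits: a small vanilla recurrent network is unrolled over one period of a cyclic target sequence, and errors are backpropagated through time so that iterating the trained map stores that cycle as an attractor. The network size, cycle length, and learning rate are illustrative assumptions.

```python
import numpy as np

# Minimal sketch, assuming a vanilla tanh recurrent network trained by
# gradient-based BPTT to store one cyclic attractor, i.e. to reproduce a
# short cyclic target sequence under iteration of the recurrent map.
rng = np.random.default_rng(0)
n, T = 8, 4                               # hidden units, cycle length
target = rng.uniform(-0.8, 0.8, (T, n))   # cyclic pattern to encode

W = rng.normal(0.0, 0.5, (n, n))          # recurrent weights
b = np.zeros(n)                           # biases

def step(h, W, b):
    """One update of the recurrent map."""
    return np.tanh(W @ h + b)

def cycle_loss(W, b):
    """Mean squared error over one pass around the target cycle."""
    h, loss = target[-1], 0.0
    for t in range(T):
        h = step(h, W, b)
        loss += np.mean((h - target[t]) ** 2)
    return loss / T

loss_before = cycle_loss(W, b)
lr = 0.05
for epoch in range(2000):
    # forward: unroll the network for one full cycle
    h = target[-1]
    hs = [h]
    for t in range(T):
        h = step(h, W, b)
        hs.append(h)
    # backward (BPTT): push errors back through the unrolled steps
    dW, db, grad_h = np.zeros_like(W), np.zeros_like(b), np.zeros(n)
    for t in reversed(range(T)):
        grad_h = grad_h + (hs[t + 1] - target[t])   # direct error at step t
        pre = (1.0 - hs[t + 1] ** 2) * grad_h       # through the tanh
        dW += np.outer(pre, hs[t])
        db += pre
        grad_h = W.T @ pre                          # to the previous step
    W -= lr * dW / T
    b -= lr * db / T

loss_after = cycle_loss(W, b)
```

The paper's point is that, even when such gradient descent reduces the training error, it tends to suppress the network's chaotic spontaneous dynamics, and with them the large encoding capacity that chaos provides.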
  • Keywords
    backpropagation; chaos; gradient methods; recurrent neural nets; back propagation through time; chaotic dynamics; cyclic attractors; gradient-based BPTT learning algorithm; recurrent neural networks; Artificial neural networks; Biological neural networks; Chaos; Chaotic communication; Convergence; Encoding; Laboratories; Learning; Recurrent neural networks; Switches;
  • fLanguage
    English
  • Publisher
IEEE
  • Conference_Titel
Proceedings of the 2004 IEEE International Joint Conference on Neural Networks
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-8359-1
  • Type

    conf

  • DOI
    10.1109/IJCNN.2004.1379874
  • Filename
    1379874