Title :
Recurrent competitive networks can learn locally excitatory topologies
Author :
Jug, Florian ; Cook, Matthew ; Steger, Angelika
Author_Institution :
Dept. of Inf., ETH Zurich, Zurich, Switzerland
Abstract :
A common form of neural network consists of spatially arranged neurons, with weighted connections between the units providing both local excitation and long-range or global inhibition. Such networks, known as soft-winner-take-all networks or lateral-inhibition-type neural fields, have been shown to exhibit desirable information-processing properties, including balancing the influence of compatible inputs, deciding between incompatible inputs, restoring signals from noisy, weak, or overly strong input, and serving as trainable building blocks in larger networks. However, the local excitatory connections in such networks are typically hand-wired according to a fixed spatial arrangement chosen using prior knowledge of the dimensionality of the data to be learned, and neuroanatomical evidence is stubbornly inconsistent with these wiring schemes. Here we present a learning rule that allows networks with completely random internal connectivity to learn the weighted connections necessary for implementing the "local" excitation these networks use, where locality is defined with respect to the inherent topology of the input received by the network, rather than an arbitrarily prescribed spatial arrangement of the cells in the network. We use the Siegert approximation to leaky integrate-and-fire neurons, which yields networks with consistently sparse activity, and apply standard Hebbian learning with weight normalization, plus homeostatic activity regulation to ensure full network utilization. Our results show that such networks learn appropriate excitatory connections from the input, and do not require these connections to be hand-wired with a fixed topology as they traditionally have been for decades.
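Method_Sketch :
The abstract's pipeline can be made concrete. The Siegert approximation replaces spiking leaky integrate-and-fire dynamics with a mean firing rate; a standard form of the formula (the Ricciardi/Siegert rate, quoted from the general literature rather than from this paper) for a neuron receiving input current with mean \mu and standard deviation \sigma is, in LaTeX notation,

\rho(\mu, \sigma) = \left[ \tau_{\mathrm{ref}} + \tau_m \sqrt{\pi} \int_{(V_{\mathrm{reset}} - \mu)/\sigma}^{(V_{\mathrm{th}} - \mu)/\sigma} e^{u^2} \bigl( 1 + \operatorname{erf}(u) \bigr) \, du \right]^{-1},

where \tau_m is the membrane time constant, \tau_{\mathrm{ref}} the refractory period, and V_{\mathrm{th}}, V_{\mathrm{reset}} the threshold and reset potentials. The Python sketch below is a minimal rate-based stand-in for the learning loop the abstract describes: random recurrent excitatory weights, soft competition in place of global inhibition, Hebbian updates with multiplicative weight normalization, and threshold homeostasis. All names, constants, and the softmax competition model are illustrative assumptions, not the authors' implementation.

    # Illustrative sketch (not the paper's code): Hebbian learning with
    # multiplicative weight normalization and homeostatic regulation in a
    # rate-based soft-WTA network with random initial connectivity.
    import numpy as np

    rng = np.random.default_rng(0)

    N_IN, N_NET = 20, 60              # input channels, network neurons
    ETA, ETA_H = 0.05, 0.01           # Hebbian / homeostatic learning rates
    TARGET = 1.0 / N_NET              # target mean activity per neuron
    GAIN = 8.0                        # competition strength (softmax gain)

    def normalize_rows(W):
        # Multiplicative normalization: each neuron's incoming weights sum to 1.
        return W / W.sum(axis=1, keepdims=True)

    def bump(center, n=N_IN, width=1.5):
        # Gaussian input bump on a 1-D ring: the hidden topology to be learned.
        d = np.abs(np.arange(n) - center)
        d = np.minimum(d, n - d)
        return np.exp(-d**2 / (2.0 * width**2))

    def soft_wta(drive):
        # Softmax stands in for global inhibition: soft competition over drives.
        z = np.exp(GAIN * (drive - drive.max()))
        return z / z.sum()

    W_in = normalize_rows(rng.random((N_NET, N_IN)))   # random feed-forward weights
    W_rec = rng.random((N_NET, N_NET))                 # random recurrent weights
    np.fill_diagonal(W_rec, 0.0)                       # no self-excitation
    W_rec = normalize_rows(W_rec)
    thresh = np.zeros(N_NET)                           # homeostatic thresholds

    for step in range(5000):
        x = bump(rng.integers(N_IN))
        r = soft_wta(W_in @ x - thresh)                # feed-forward pass
        for _ in range(3):                             # settle with recurrence
            r = soft_wta(W_in @ x + W_rec @ r - thresh)
        # Hebbian updates, followed by multiplicative normalization.
        W_in = normalize_rows(W_in + ETA * np.outer(r, x))
        W_rec = W_rec + ETA * np.outer(r, r)
        np.fill_diagonal(W_rec, 0.0)
        W_rec = normalize_rows(W_rec)
        # Homeostasis: over-active neurons get higher thresholds, idle ones lower.
        thresh += ETA_H * (r - TARGET)

    pref = np.argmax(W_in, axis=1)                     # each neuron's preferred input
    i, j = np.unravel_index(np.argmax(W_rec), W_rec.shape)
    print("strongest recurrent pair prefers inputs", pref[i], "and", pref[j])

After many presentations of input bumps drawn on a 1-D ring, the strongest recurrent weights in this toy model should link neurons with neighboring input preferences, mirroring the paper's claim that "local" excitation can be learned from input topology rather than hand-wired.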
Keywords :
Hebbian learning; approximation theory; biology computing; data handling; recurrent neural nets; Siegert approximation; compatible input influence balancing; data dimensionality; fixed spatial arrangement; global inhibition; homeostatic activity regulation; incompatible inputs; information-processing properties; lateral-inhibition type neural fields; leaky integrate-and-fire neurons; learning rule; local excitation; local excitatory connections; locally excitatory topologies; neural network; neuroanatomical evidence; random internal connectivity; recurrent competitive networks; signal restoration; soft-winner-take-all networks; sparse activity; standard Hebbian learning; trainable building blocks; weight normalization; weighted connections; Approximation methods; Computational modeling; Hebbian theory; Network topology; Neurons; Topology; Wiring;
Conference_Titel :
The 2012 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Brisbane, QLD, Australia
Print_ISBN :
978-1-4673-1488-6
ISSN :
2161-4393
DOI :
10.1109/IJCNN.2012.6252786