Title :
Do Hebbian synapses estimate entropy?
Author :
Erdogmus, Deniz ; Principe, Jose C. ; Hild, Kenneth E., II
Author_Institution :
Dept. of Electr. & Comput. Eng., Florida Univ., Gainesville, FL, USA
Abstract :
Hebbian learning is one of the mainstays of biologically inspired neural processing. Hebb's (1949) rule is biologically plausible, and it has been extensively utilized in both computational neuroscience and in unsupervised training of neural systems. In these fields, Hebbian learning has become synonymous with correlation learning. However, correlation is a second-order statistic of the data, so it is sub-optimal when the goal is to extract as much information as possible from the sensory data stream. We demonstrate how information learning can be implemented using Hebb's rule. The paper thus brings a new understanding of how neural systems could, through Hebb's rule, extract information-theoretic quantities rather than mere correlation.
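The contrast drawn in the abstract between second-order (correlation) learning and information-theoretic learning can be illustrated with a small numerical sketch. The Python snippet below is not the paper's algorithm; it simply pairs a plain Hebbian (correlation) update of a linear neuron with a Parzen-window estimate of Renyi's quadratic entropy of that neuron's output, the kind of nonparametric entropy estimator the keywords refer to. The kernel size sigma, the learning rate eta, and the synthetic Gaussian data are illustrative assumptions, not values from the paper.

    # Illustrative sketch only: a correlation (Hebbian) weight update and a
    # Parzen-window estimate of Renyi's quadratic entropy of the neuron output.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal((500, 2))   # synthetic sensory input samples (N x d)
    w = rng.standard_normal(2)          # synaptic weight vector of a linear neuron
    eta, sigma = 0.01, 0.5              # assumed learning rate and kernel size

    def renyi_quadratic_entropy(y, sigma):
        """Parzen-window estimate H2 = -log V(y), where V(y) is the mean of
        pairwise Gaussian kernels (the 'information potential') over outputs y."""
        d = y[:, None] - y[None, :]
        v = np.exp(-d**2 / (4.0 * sigma**2)).mean() / np.sqrt(4.0 * np.pi * sigma**2)
        return -np.log(v)

    # Plain Hebbian update: dw proportional to the input-output correlation <x*y>,
    # i.e. a second-order statistic of the data.
    y = x @ w
    w_hebb = w + eta * (x * y[:, None]).mean(axis=0)

    # Entropy of the neuron output before and after the correlation-driven update.
    print("H2 before update:", renyi_quadratic_entropy(x @ w, sigma))
    print("H2 after update :", renyi_quadratic_entropy(x @ w_hebb, sigma))

The point of the sketch is that the entropy estimate is built from pairwise kernel interactions between output samples rather than from the covariance alone, which is why an entropy-driven rule can capture more than the second-order structure that a pure correlation update sees.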
Keywords :
Hebbian learning; correlation methods; entropy; neural nets; unsupervised learning; Hebb's rule; Hebbian synapses; biologically inspired neural processing; computational neuroscience; correlation learning; entropy estimation; information learning; information theoretic quantities extraction; neural systems; nonparametric entropy estimator; second order statistic; sensory data stream; unsupervised training; Biological information theory; Biological neural networks; Biology computing; Data mining; Entropy; Hebbian theory; Information processing; Information theory; Neurons; Statistics;
Conference_Titel :
Proceedings of the 2002 12th IEEE Workshop on Neural Networks for Signal Processing
Print_ISBN :
0-7803-7616-1
DOI :
10.1109/NNSP.2002.1030031