DocumentCode :
785912
Title :
Dually Optimal Neuronal Layers: Lobe Component Analysis
Author :
Weng, Juyang ; Luciw, Matthew
Author_Institution :
Dept. of Comput. Sci. & Eng., Michigan State Univ., East Lansing, MI
Volume :
1
Issue :
1
fYear :
2009
fDate :
5/1/2009
Firstpage :
68
Lastpage :
85
Abstract :
Development imposes great challenges. Internal "cortical" representations must be autonomously generated from interactive experiences. The eventual quality of these developed representations is of course important. Additionally, learning must be as fast as possible, so that better representations can be quickly derived from limited experiences. Those who achieve both will have competitive advantages. We present a cortex-inspired theory called lobe component analysis (LCA) guided by the aforementioned dual criteria. A lobe component represents a high concentration of probability density in the neuronal input space. Through mathematical analysis, we explain how lobe components can achieve dual spatiotemporal ("best" and "fastest") optimality, describing how lobe component plasticity can be temporally scheduled to take the history of observations into account in the best possible way. This contrasts with gradient-based adaptive learning algorithms, which use only the last observation. Since they are based on two cell-centered mechanisms, Hebbian learning and lateral inhibition, lobe components develop in place, meaning that every networked neuron is individually responsible for learning its signal-processing characteristics within its connected network environment; there is no need for a separate learning network. We argue that in-place learning algorithms will be crucial for real-world, large-size developmental applications due to their simplicity, low computational complexity, and generality. Our experimental results show that, thanks to its dual optimality, the LCA algorithm learns drastically faster than other Hebbian-based updating methods and independent component analysis algorithms, and it does not need to use any second- or higher-order statistics. We also introduce the new principle of fast learning from stable representation.
Keywords :
Hebbian learning; brain models; cellular biophysics; computational complexity; independent component analysis; neural nets; neurophysiology; probability; Hebbian learning network; Hebbian-based updating method; cell-centered mechanism; computational complexity; cortex-inspired theory; cortical model; dual-spatiotemporal components; dually optimal neuronal layers; gradient-based adaptive learning algorithm; independent component analysis algorithm; lobe component analysis; neuron network; plasticity; probability density; signal-processing characteristics; Blind source separation; Hebbian learning; cortical models; feature extraction; optimality; plasticity;
fLanguage :
English
Journal_Title :
Autonomous Mental Development, IEEE Transactions on
Publisher :
IEEE
ISSN :
1943-0604
Type :
jour
DOI :
10.1109/TAMD.2009.2021698
Filename :
4895712