DocumentCode :
2466606
Title :
A scalable analog architecture for neural networks with on-chip learning and refreshing
Author :
Alhalabi, B.A. ; Bayoumi, Magdy
Author_Institution :
Center for Adv. Comput. Studies, Univ. of Southwestern Louisiana, Lafayette, LA, USA
fYear :
1995
fDate :
16-18 Mar 1995
Firstpage :
33
Lastpage :
38
Abstract :
This paper discusses various techniques for analog storage and handling and proposes a new class of architectures suitable for modular, scalable analog neural networks with on-chip learning and refreshing. The new architecture is based on analog functional blocks and analog pass switches, which enhance the system's versatility. Supporting algorithms are also developed. A novel characteristic is the fully analog on-chip learning methodology, which substantially increases the learning speed. The speedup stems from a local analog synaptic updating scheme that eliminates time-shared components. Moreover, this localization scheme allows unbounded scalability of the neural architecture.
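The following is a minimal conceptual sketch, not the authors' analog circuitry: it illustrates in software why a per-synapse ("local") update rule removes the serial bottleneck of a time-shared updating unit, which is the source of the speedup claimed in the abstract. All names (local_update, time_shared_update, eta) and the specific outer-product update rule are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def local_update(W, x, error, eta=0.01):
    """All synapses update simultaneously from purely local signals
    (outer product of postsynaptic error and presynaptic activity)."""
    return W + eta * np.outer(error, x)

def time_shared_update(W, x, error, eta=0.01):
    """The same rule applied by a single shared updater visiting one
    synapse at a time -- the serial bottleneck a local scheme avoids."""
    W = W.copy()
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            W[i, j] += eta * error[i] * x[j]
    return W

# Both produce identical weights; only the update parallelism differs.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))
x = rng.standard_normal(3)
err = rng.standard_normal(4)
assert np.allclose(local_update(W, x, err), time_shared_update(W, x, err))
```

In the paper's analog setting the parallelism is physical: each synapse carries its own update circuitry, so update time does not grow with network size, which is the basis of the scalability claim.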
Keywords :
analogue processing circuits; analogue storage; learning (artificial intelligence); neural chips; analog functional blocks; analog pass switches; analog storage; learning speed; local analog synaptic updating scheme; neural networks; on-chip learning; on-chip refreshing; scalable analog architecture; system versatility; unbounded scalability; Analog computers; Capacitors; Computer architecture; Computer networks; Hardware; Network-on-a-chip; Neural networks; Silicon; Switches; System-on-a-chip;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the Fifth Great Lakes Symposium on VLSI, 1995
Conference_Location :
Buffalo, NY
ISSN :
1066-1395
Print_ISBN :
0-8186-7035-5
Type :
conf
DOI :
10.1109/GLSV.1995.516020
Filename :
516020