DocumentCode :
3381654
Title :
Massive memory organizations for implementing neural networks
Author :
Misra, Manavendra ; Kumar, V. K. Prasanna
Author_Institution :
Dept. of Electr. Eng-Syst., Univ. of Southern California, Los Angeles, CA, USA
Volume :
ii
fYear :
1990
fDate :
16-21 Jun 1990
Firstpage :
259
Abstract :
A single-instruction, multiple-data (SIMD) architecture with n processing elements and n² memory modules arranged in an n×n array is presented. This massive memory is used to store the weights of the neural network being simulated. It is shown how networks with sparse connectivity among neurons can be simulated in O(√n + e) time, where n is the number of neurons and e is the number of interconnections in the network. Preprocessing is carried out on the connection matrix of the sparse network, resulting in data movement with optimal asymptotic time complexity and a small constant factor.
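Illustrative sketch (not taken from the paper): one plausible reading of the abstract is that each of the n processing elements owns one row of the n×n grid of memory modules, the module at grid position (i, j) holds the weight of the connection from neuron j into neuron i, and a simulation sweep touches only the e stored connections. The weight layout, the function names, and the tanh activation below are assumptions made for illustration; the serial Python loop only mirrors the O(e) work that the SIMD rows would perform in parallel and does not reproduce the paper's preprocessing or data-movement scheme.

    import numpy as np

    def distribute_weights(weights, n):
        # Place each nonzero weight w[i][j] (neuron j -> neuron i) in the memory
        # module at grid position (i, j); modules with no connection stay empty.
        # Illustrative mapping only -- the paper preprocesses the sparse
        # connection matrix to obtain its own optimal layout.
        modules = {}                            # (row, col) -> weight
        occupied_cols = [[] for _ in range(n)]  # nonempty modules in each PE's row
        for (i, j), w in weights.items():
            modules[(i, j)] = w
            occupied_cols[i].append(j)
        return modules, occupied_cols

    def simulate_step(modules, occupied_cols, state):
        # One sweep of the simulation: PE i reads only the occupied modules in
        # its row, so the total work over all rows is proportional to e, the
        # number of interconnections (rows would run in parallel on the SIMD machine).
        n = len(state)
        new_state = np.empty(n)
        for i in range(n):                      # conceptually one PE per row
            acc = sum(modules[(i, j)] * state[j] for j in occupied_cols[i])
            new_state[i] = np.tanh(acc)         # activation function assumed for the sketch
        return new_state

    # Usage: a 4-neuron network with 4 sparse connections.
    n = 4
    weights = {(0, 1): 0.5, (1, 2): -0.3, (2, 0): 0.8, (3, 2): 0.1}
    modules, occupied = distribute_weights(weights, n)
    state = simulate_step(modules, occupied, np.ones(n))
    print(state)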
Keywords :
computational complexity; content-addressable storage; digital storage; memory architecture; neural nets; parallel processing; SIMD architecture; massive memory organisations; neural networks; optimal asymptotic time complexity; reduced mesh of trees organization; single-input multiple-data architecture; sparse connectivity; Artificial neural networks; Biological neural networks; Biological system modeling; Biology computing; Computational modeling; Computer networks; Humans; Neural networks; Neurons; Sparse matrices;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Pattern Recognition, 1990. Proceedings, 10th International Conference on
Conference_Location :
Atlantic City, NJ
Print_ISBN :
0-8186-2062-5
Type :
conf
DOI :
10.1109/ICPR.1990.119367
Filename :
119367