DocumentCode :
3289775
Title :
Parametric connectivity: feasibility of learning in constrained weight space
Author :
Caudell, Thomas P.
Author_Institution :
Hughes Res. Lab., Malibu, CA, USA
fYear :
1989
fDate :
0-0 1989
Firstpage :
667
Abstract :
Consideration is given to the impact on the performance of selected learning algorithms when specific artificial neural models are constrained. The particular model of constraint under consideration is parametric connectivity (PC), in which the weights of the incoming links are constrained to be a function of a relatively small number of parameters. This can, in principle, be implemented in an electrooptical system, using such devices as photodetectors, miniature electrooptical cells, and laser diodes. Low-resolution holographic mirrors may be used to direct the global structure of the network architecture. A simulation using PC has been developed. Currently, layered PC networks that implement simple logic functions are being investigated. The performance of networks that use PC units (PCUs) is measured. PC is incorporated into the generalized delta rule and into genetic algorithms to measure learning capacity. PC allows almost complete generality in network implementation, while taking advantage of optical system performance.
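The abstract describes weights constrained to be a function of a small number of parameters, trained with the generalized delta rule. Below is a minimal, hypothetical sketch (not from the paper) of such a parametrically connected unit, assuming a Gaussian weight profile over the input index and a finite-difference gradient step on squared error; the function names and parameterization are illustrative assumptions only.

```python
# Hypothetical sketch of a parametrically connected unit (PCU):
# the incoming weights are not free parameters but a function of a small
# parameter vector theta (here: amplitude, center, width of a Gaussian).
import numpy as np

def weights(theta, n_inputs):
    """Weight vector generated from three parameters."""
    a, c, s = theta
    idx = np.arange(n_inputs)
    return a * np.exp(-((idx - c) ** 2) / (2.0 * s ** 2))

def forward(theta, x):
    """Sigmoid unit whose weights are derived from theta."""
    return 1.0 / (1.0 + np.exp(-np.dot(weights(theta, x.size), x)))

def delta_rule_step(theta, x, target, lr=0.1, eps=1e-4):
    """Gradient step on squared error w.r.t. theta (finite differences for brevity)."""
    grad = np.zeros_like(theta)
    base = (forward(theta, x) - target) ** 2
    for k in range(theta.size):
        t = theta.copy()
        t[k] += eps
        grad[k] = ((forward(t, x) - target) ** 2 - base) / eps
    return theta - lr * grad

# Toy usage: learn to respond to inputs whose component at index 2 is large.
rng = np.random.default_rng(0)
theta = np.array([1.0, 0.0, 1.0])
for _ in range(200):
    x = rng.random(8)
    target = 1.0 if x[2] > 0.5 else 0.0
    theta = delta_rule_step(theta, x, target)
print("learned parameters (amplitude, center, width):", theta)
```

Because only three parameters are adapted rather than one weight per link, the same update could equally be driven by a genetic algorithm over theta, which is the other learning scheme the abstract mentions.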
Keywords :
learning systems; neural nets; optical logic; artificial neural models; constrained weight space; delta rule; electrooptical system; genetic algorithms; learning algorithms; logic functions; parametric connectivity; simulation; Learning systems; Neural networks; Optical logic devices
fLanguage :
English
Publisher :
ieee
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1989
Conference_Location :
Washington, DC, USA
Type :
conf
DOI :
10.1109/IJCNN.1989.118650
Filename :
118650