DocumentCode
1985281
Title
Method of computing gradient vector and Jacobian matrix in arbitrarily connected neural networks
Author
Wilamowski, Bogdan M. ; Cotton, Nicholas J. ; Kaynak, Okyay ; Dündar, Günhan
Author_Institution
Auburn Univ., Auburn
fYear
2007
fDate
4-7 June 2007
Firstpage
3298
Lastpage
3303
Abstract
The paper shows that if fully connected neural networks are used, then the same problem can be solved with a smaller number of neurons and weights. Interestingly, such networks are also trained faster. The problem is that most neural network training algorithms are not suitable for such networks. The presented algorithm and software allow training of feedforward neural networks with arbitrarily connected neurons, in a similar way as the SPICE program can analyze any circuit topology. When a second-order algorithm is used (for which the Jacobian must be calculated), the solution is obtained about 100 times faster.
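To make the quantities named in the abstract concrete, the following Python sketch trains a small, arbitrarily connected feedforward network with a Levenberg-Marquardt step dw = -(J^T J + mu I)^{-1} J^T e, where J is the Jacobian of the pattern errors with respect to the weights. The topology, the tanh activation, the XOR task, and the finite-difference (perturbation) computation of J are illustrative assumptions made for this example; the paper's contribution is an exact computation of the gradient vector and Jacobian for arbitrary topologies, which this sketch does not reproduce.

import numpy as np

# Illustrative sketch only: an arbitrarily connected feedforward network is
# described by, for each neuron, the list of signals it receives (network
# inputs or outputs of earlier neurons). Topology, tanh activation, and the
# finite-difference (perturbation) Jacobian are assumptions for this example.

def forward(weights, connections, x):
    """Propagate one input pattern through the arbitrarily connected network."""
    signals = list(x)                 # signals 0..n_inputs-1 are the inputs
    w = iter(weights)
    for conns in connections:
        net = next(w)                 # bias weight of this neuron
        for src in conns:
            net += next(w) * signals[src]
        signals.append(np.tanh(net))  # neuron output becomes a new signal
    return signals[-1]                # last neuron is the network output

def errors(weights, connections, X, y):
    """Error vector e_p = out_p - y_p over all training patterns."""
    return np.array([forward(weights, connections, xp) - yp
                     for xp, yp in zip(X, y)])

def jacobian_fd(weights, connections, X, y, eps=1e-6):
    """Perturbation (finite-difference) approximation of J = de/dw."""
    e0 = errors(weights, connections, X, y)
    J = np.zeros((e0.size, weights.size))
    for j in range(weights.size):
        wp = weights.copy()
        wp[j] += eps
        J[:, j] = (errors(wp, connections, X, y) - e0) / eps
    return J, e0

def lm_step(weights, J, e, mu=0.01):
    """One Levenberg-Marquardt update: dw = -(J^T J + mu I)^-1 J^T e."""
    A = J.T @ J + mu * np.eye(weights.size)
    return weights - np.linalg.solve(A, J.T @ e)

# Example: XOR with a 3-neuron, cross-connected (non-layered) topology.
# Signals 0,1 are the inputs; the output neuron also sees the inputs directly.
connections = [[0, 1], [0, 1], [0, 1, 2, 3]]
n_weights = sum(len(c) + 1 for c in connections)   # +1 bias per neuron
weights = np.random.default_rng(0).normal(scale=0.5, size=n_weights)

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [-1, 1, 1, -1]

for _ in range(50):
    J, e = jacobian_fd(weights, connections, X, y)
    weights = lm_step(weights, J, e)
e_fin = errors(weights, connections, X, y)
print("final sum of squared errors:", float(e_fin @ e_fin))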
Keywords
Jacobian matrices; feedforward neural nets; gradient methods; mathematics computing; Jacobian matrix; arbitrarily connected neurons; connected neural networks; feedforward neural networks; gradient vector; Computer networks; Feedforward neural networks; Jacobian matrices; Network topology; Neural networks; Neurons; Perturbation methods; SPICE; Software algorithms;
fLanguage
English
Publisher
ieee
Conference_Titel
Industrial Electronics, 2007. ISIE 2007. IEEE International Symposium on
Conference_Location
Vigo
Print_ISBN
978-1-4244-0754-5
Electronic_ISBN
978-1-4244-0755-2
Type
conf
DOI
10.1109/ISIE.2007.4375144
Filename
4375144
Link To Document