Title :
A statistical investigation of cost-function derivatives for neural networks with continuous activation functions
Author_Institution :
Stirling Univ., UK
Abstract :
By making simple assumptions about the distribution of potentials at the nodes of a feed-forward multilayer network with continuous activation functions, the author derives analytic expressions for the mean and standard deviation of the cost function and for the root-mean-square values of its derivatives. He shows how this information can be used to obtain systematic estimates of the range of weight changes required for successful implementation of the iterative-improvement algorithm on the encoder problem. This algorithm was chosen as an example on account of its simplicity rather than its efficacy: although, for the case considered, the mean number of epochs required was about 40, compared with about 50 for backpropagation, backpropagation is roughly a factor of 10 faster than iterative improvement in terms of CPU time per epoch.
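The abstract only names the iterative-improvement scheme; the paper's details are not reproduced here. As a rough, hedged illustration of that kind of algorithm, the following sketch assumes a 4-2-4 encoder task, sigmoid activations, a sum-squared-error cost, and Gaussian weight perturbations of an arbitrary scale `delta` (the quantity the paper's statistics on cost-function derivatives would help choose). A perturbation is accepted only if it lowers the cost.

```python
# Minimal sketch (not the author's implementation) of iterative improvement
# for an assumed 4-2-4 encoder problem with sigmoid units.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Encoder task: inputs are the four one-hot patterns, targets equal the inputs.
X = np.eye(4)
T = X.copy()

def cost(W1, W2):
    H = sigmoid(X @ W1)                  # hidden-layer activations
    Y = sigmoid(H @ W2)                  # output-layer activations
    return 0.5 * np.sum((Y - T) ** 2)    # sum-squared-error cost

W1 = rng.normal(scale=0.5, size=(4, 2))
W2 = rng.normal(scale=0.5, size=(2, 4))
delta = 0.1                              # assumed perturbation (weight-change) scale

best = cost(W1, W2)
for epoch in range(2000):
    dW1 = rng.normal(scale=delta, size=W1.shape)
    dW2 = rng.normal(scale=delta, size=W2.shape)
    trial = cost(W1 + dW1, W2 + dW2)
    if trial < best:                     # keep only improving moves
        W1, W2 = W1 + dW1, W2 + dW2
        best = trial

print(f"final cost: {best:.4f}")
```

The acceptance rule makes each epoch cheap (two forward passes, no gradients), which is consistent with the abstract's point that iterative improvement trades per-epoch cost against needing a well-chosen perturbation range.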
Keywords :
artificial intelligence; learning systems; neural nets; analytic expressions; backpropagation; continuous activation functions; cost-function derivatives; encoder; feed-forward multilayer network; mean deviation; neural networks; root-mean-square values; standard deviation; statistical investigation;
Conference_Title :
Second International Conference on Artificial Neural Networks, 1991
Conference_Location :
Bournemouth
Print_ISBN :
0-85296-531-1