DocumentCode :
1749078
Title :
On complexity analysis of supervised MLP-learning for algorithmic comparisons
Author :
Mizutani, Eiji ; Dreyfus, Stuart E.
Author_Institution :
Dept. of Comput. Sci., Nat. Tsing Hua Univ., Hsinchu, Taiwan
Volume :
1
fYear :
2001
fDate :
2001
Firstpage :
347
Abstract :
This paper presents a complexity analysis of a standard supervised MLP-learning algorithm used in conjunction with the well-known backpropagation procedure, an efficient method for evaluating derivatives, in either batch or incremental learning mode. In particular, we detail the cost per epoch (i.e., the operations required to process one sweep of all the training data) in “approximate” FLOPs (floating-point operations) for a typical backpropagation implementation that solves neural-network nonlinear least-squares problems. Furthermore, we identify erroneous complexity analyses found in the past NN literature. Our operation-count formula is useful for comparing learning algorithms on a given MLP architecture.
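To make the kind of per-epoch operation count described in the abstract concrete, below is a minimal Python sketch that tallies approximate FLOPs for batch-mode backpropagation on a fully connected MLP trained on a nonlinear least-squares loss. The function name flops_per_epoch, the assumed activation cost act_flops, and the per-layer cost factors are illustrative assumptions, not the paper's exact operation-count formula.

# Illustrative sketch only: approximate FLOPs for one epoch (one sweep
# of all training data) of batch-mode backpropagation on a fully
# connected MLP. Cost factors and act_flops are assumptions, not the
# operation-count formula derived in the paper.

def flops_per_epoch(layer_sizes, num_examples, act_flops=5):
    """Approximate FLOPs per epoch.

    layer_sizes  -- e.g. [n_inputs, n_hidden, ..., n_outputs]
    num_examples -- number of training patterns swept per epoch
    act_flops    -- assumed cost of one activation (and its derivative)
    """
    fwd = bwd = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        # Forward pass: multiply-add pairs for weighted sums, plus
        # bias additions and activation evaluations.
        fwd += 2 * n_in * n_out + n_out + act_flops * n_out
        # Backward pass: propagating errors back through the weights,
        # scaling by activation derivatives, and accumulating the
        # weight gradients.
        bwd += 2 * n_in * n_out + act_flops * n_out + 2 * n_in * n_out
    num_weights = sum(n_in * n_out + n_out
                      for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))
    # Batch mode: one gradient-descent weight update per epoch
    # (scale each gradient component by the step size, then subtract).
    update = 2 * num_weights
    return num_examples * (fwd + bwd) + update

# Example: a 10-20-1 MLP trained on 500 patterns.
print(flops_per_epoch([10, 20, 1], num_examples=500))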
Keywords :
backpropagation; computational complexity; least squares approximations; multilayer perceptrons; batch learning; incremental learning; least squares; neural networks; supervised learning; Algorithm design and analysis; Backpropagation algorithms; Computer architecture; Computer science; Costs; Industrial engineering; Least squares methods; Operations research; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Title :
IJCNN '01: Proceedings of the International Joint Conference on Neural Networks, 2001
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-7044-9
Type :
conf
DOI :
10.1109/IJCNN.2001.939044
Filename :
939044