Title :
Training of Large-Scale Feed-Forward Neural Networks
Author_Institution :
Leibniz-Inst. of Plant Genetics & Crop Plant Res., Gatersleben
Abstract :
Neural processing of large-scale data sets containing both many input/output variables and a large number of training examples often leads to very large networks. Once these networks become large-scale in the truest sense of the word (several tens of thousands of weights), two major inconveniences, and possibly more, arise: (1) conventional training algorithms perform very poorly, and common knowledge about them may no longer hold, and (2) training time and, even more importantly, memory limitations increasingly move into the focus of attention. Both issues are addressed in this paper by means of biomedical image segmentation based on supervised neural network classification of previously extracted image features.
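As a rough illustration of the scale the abstract refers to, the sketch below (layer sizes are hypothetical and not taken from the paper) counts the trainable weights of a fully connected feed-forward topology and estimates their memory footprint:

```python
# Illustrative sketch only: the topology below is a hypothetical example,
# not the network used in the paper.

def count_weights(layer_sizes):
    """Number of trainable parameters in a fully connected feed-forward net."""
    return sum((fan_in + 1) * fan_out          # +1 accounts for the bias unit
               for fan_in, fan_out in zip(layer_sizes[:-1], layer_sizes[1:]))

if __name__ == "__main__":
    # Hypothetical topology: many extracted image features in, two hidden
    # layers, a handful of segmentation classes out.
    topology = [200, 150, 100, 5]
    n_weights = count_weights(topology)
    bytes_per_weight = 8                       # 64-bit floating point
    print(f"topology {topology}: {n_weights} weights "
          f"(~{n_weights * bytes_per_weight / 1024:.1f} KiB for the weights alone)")
    # Second-order training methods may additionally need O(n_weights^2)
    # memory (e.g. for a Hessian approximation), which is where the memory
    # limitations mentioned in the abstract become the dominant concern.
```

With these assumed layer sizes the network already has roughly 46,000 weights, i.e. it falls into the "several tens of thousands of weights" regime the abstract describes.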
Keywords :
feature extraction; feedforward neural nets; image segmentation; learning (artificial intelligence); biomedical image segmentation; feed-forward neural network; image feature extraction; large-scale data set; memory limitation; neural processing; supervised neural network classification; training algorithm; Artificial neural networks; Feedforward neural networks; Feedforward systems; Focusing; Image segmentation; Large-scale systems; Network topology; Neural networks; Prototypes; Testing;
Conference_Title :
2006 International Joint Conference on Neural Networks (IJCNN '06)
Conference_Location :
Vancouver, BC
Print_ISBN :
0-7803-9490-9
DOI :
10.1109/IJCNN.2006.247289