Title :
Practical polynomial expansion of input data can improve neurocomputing results
Author_Institution :
Industrial Research Ltd., Auckland, New Zealand
Abstract :
Multi-layered perceptron (MLP) neurocomputing networks can be slow to learn and produce non-analytical results; however, when presented with appropriately conditioned input data, MLPs offer very good generalization on complex classification problems. The author recounts experiences from, and techniques used in, applying MLPs and backpropagation to find successful and better solutions to otherwise difficult practical problems, which often require the determination of real-valued numbers. The most successful contributor has been a polynomial expansion technique applied to the input data, which has helped to reduce network size, speed up training, and improve the accuracy of the neurocomputing results.
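The abstract does not specify the exact expansion used; as an illustrative sketch only (the function name polynomial_expand and the degree parameter are assumptions, not the author's method), one common form of polynomial input expansion augments each input vector with all monomial terms up to a chosen degree before the data is fed to the MLP:

import numpy as np
from itertools import combinations_with_replacement

def polynomial_expand(X, degree=2):
    # Hypothetical helper (not from the paper): expand each input row
    # with all monomial terms up to `degree`, plus a constant bias term.
    X = np.asarray(X, dtype=float)
    n_samples, n_features = X.shape
    columns = [np.ones(n_samples)]  # constant (bias) term
    for d in range(1, degree + 1):
        for idx in combinations_with_replacement(range(n_features), d):
            # Product of the selected input components, e.g. x1*x2 or x1**2.
            columns.append(np.prod(X[:, list(idx)], axis=1))
    return np.column_stack(columns)

# A 2-D input expanded to degree 2 becomes [1, x1, x2, x1^2, x1*x2, x2^2],
# so the network sees nonlinear terms directly and can stay smaller.
X = np.array([[0.5, -1.0], [2.0, 3.0]])
print(polynomial_expand(X, degree=2))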
Keywords :
backpropagation; data preparation; feedforward neural nets; polynomials; MLPs; complex classification problems; feedforward perceptron; generalization results; input data; multilayered perceptron; neurocomputing results; polynomial expansion; real-valued numbers; training; Image recognition; Problem-solving; Robots; Robustness; Signal analysis; Signal processing; Spirals
Conference_Title :
Proceedings of the First New Zealand International Two-Stream Conference on Artificial Neural Networks and Expert Systems, 1993
Conference_Location :
Dunedin
Print_ISBN :
0-8186-4260-2
DOI :
10.1109/ANNES.1993.323086