DocumentCode
982805
Title
Backpropagation uses prior information efficiently
Author
Barnard, Etienne ; Botha, Elizabeth C.
Author_Institution
Dept. of Comput. Sci. & Eng., Oregon Graduate Inst., Portland, OR, USA
Volume
4
Issue
5
fYear
1993
fDate
9/1/1993
Firstpage
794
Lastpage
802
Abstract
The ability of neural net classifiers to deal with a priori information is investigated. For this purpose, backpropagation classifiers are trained with data from known distributions with variable a priori probabilities, and their performance on separate test sets is evaluated. It is found that backpropagation employs a priori information in a slightly suboptimal fashion, but that this does not seriously degrade classifier performance. Furthermore, it is found that the inferior generalization that results when an excessive number of network parameters is used can be partially ascribed to this suboptimality.
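As a minimal illustrative sketch (not the paper's exact experimental setup), the following Python code trains a small backpropagation network (scikit-learn's MLPClassifier) on synthetic data drawn from two Gaussian class-conditional densities with assumed priors of 0.8 and 0.2, and compares its test error with the Bayes-optimal rule computed from the known priors and densities. The distribution parameters, network size, and sample counts are all illustrative assumptions.

```python
# Sketch: backpropagation classifier vs. Bayes-optimal rule under known,
# unequal a priori probabilities (assumed parameters, not the paper's).
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
priors = np.array([0.8, 0.2])                 # assumed a priori probabilities
means = [np.zeros(2), np.array([1.5, 1.5])]   # assumed class-conditional means
cov = np.eye(2)                               # shared identity covariance

def sample(n):
    """Draw n labelled points from the two-class Gaussian mixture."""
    labels = rng.choice(2, size=n, p=priors)
    points = np.array([rng.multivariate_normal(means[c], cov) for c in labels])
    return points, labels

X_train, y_train = sample(2000)
X_test, y_test = sample(10000)

# Backpropagation classifier (multilayer perceptron).
net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
net.fit(X_train, y_train)
net_error = 1.0 - net.score(X_test, y_test)

# Bayes-optimal decision: choose the class maximising prior * likelihood.
post = np.stack([priors[c] * multivariate_normal(means[c], cov).pdf(X_test)
                 for c in range(2)], axis=1)
bayes_error = np.mean(post.argmax(axis=1) != y_test)

print(f"backprop test error: {net_error:.3f}, Bayes error: {bayes_error:.3f}")
```

The gap between the network's test error and the Bayes error on such synthetic problems is the kind of measurement the paper uses to assess how efficiently backpropagation exploits the prior information.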
Keywords
backpropagation; generalisation (artificial intelligence); neural nets; pattern recognition; probability; a priori information; generalization; neural net classifiers; probabilities; Computer science; Humans; Information resources; Least squares approximation; Neural networks; Speech; Springs; Testing
fLanguage
English
Journal_Title
IEEE Transactions on Neural Networks
Publisher
IEEE
ISSN
1045-9227
Type
jour
DOI
10.1109/72.248457
Filename
248457