DocumentCode
276664
Title
Generalization accuracy of probabilistic neural networks compared with backpropagation networks
Author
Specht, Donald F.; Shapiro, Philip D.
Author_Institution
Lockheed Palo Alto Res. Lab., CA, USA
Volume
i
fYear
1991
fDate
8-14 Jul 1991
Firstpage
887
Abstract
The authors demonstrate that probabilistic neural networks (PNN) and backpropagation networks (BPN) generalize comparably for a wide variety of low- and high-dimensional artificial databases. A training-time advantage matters most when exploring new databases and preprocessing techniques to determine classification accuracies for potential applications. Since it is demonstrated that classification accuracy can be determined equally well using either PNN or BPN, it is advantageous to use PNN for this stage of a classification project. It is during this phase that most of the effort goes into training a network, and relatively little goes into testing and evaluating its accuracy. The PNN is based on the Bayes strategy for decision making and Parzen window estimation, and is asymptotically Bayes-optimal within a given feature space.
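For illustration only, the following is a minimal Python/NumPy sketch of the PNN decision rule summarized in the abstract (not the authors' implementation): class-conditional densities are estimated with Gaussian Parzen windows centred on the training exemplars, and a Bayes decision with equal priors selects the class of largest estimated density. The function name, the single bandwidth parameter sigma, and the toy data are assumptions for illustration.

import numpy as np

def pnn_classify(X_train, y_train, X_test, sigma=0.5):
    # Probabilistic neural network sketch: Parzen window estimation plus
    # Bayes decision rule. sigma is a common Gaussian kernel width
    # (hypothetical default; in practice it is tuned per problem).
    classes = np.unique(y_train)
    predictions = []
    for x in X_test:
        scores = []
        for c in classes:
            exemplars = X_train[y_train == c]
            # Squared Euclidean distances from x to every exemplar of class c
            d2 = np.sum((exemplars - x) ** 2, axis=1)
            # Parzen window density estimate: mean of Gaussian kernels
            scores.append(np.mean(np.exp(-d2 / (2.0 * sigma ** 2))))
        # Bayes decision with equal priors: pick the largest estimated density
        predictions.append(classes[int(np.argmax(scores))])
    return np.array(predictions)

# Toy usage on hypothetical data: two well-separated Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(4.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(pnn_classify(X, y, np.array([[0.1, -0.2], [3.9, 4.1]])))  # expected: [0 1]

Note that "training" here amounts to storing the exemplars, which reflects the training-time advantage of PNN over iteratively trained BPN discussed in the abstract.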
Keywords
learning systems; neural nets; pattern recognition; probability; Bayes strategy; Parzen window estimation; asymptotically Bayes-optimal; backpropagation networks; classification accuracies; databases; decision making; generalisation accuracy; preprocessing techniques; probabilistic neural networks; training time; Art; Artificial neural networks; Backpropagation; Laboratories; Neural networks; Neurons; Testing; Training data
fLanguage
English
Publisher
ieee
Conference_Title
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
Conference_Location
Seattle, WA
Print_ISBN
0-7803-0164-1
Type
conf
DOI
10.1109/IJCNN.1991.155296
Filename
155296