DocumentCode :
276663
Title :
Objective functions for probability estimation
Author :
Miller, John W. ; Goodman, Rod ; Smyth, Padhraic
Author_Institution :
California Inst. of Technol., Pasadena, CA, USA
Volume :
i
fYear :
1991
fDate :
8-14 Jul 1991
Firstpage :
881
Abstract :
The authors generalize and extend previously known results on obtaining probability estimates from neural network classifiers. In particular, they derive necessary and sufficient conditions for an objective function to minimize to a probability. The objective function L(x,t) was found to be uniquely specified by the function L(x,0). This function L(x,0) was found to satisfy further restrictions when a condition of logical symmetry is required. These restrictions, together with the relation between L(x,t) and L(x,0), define the class of all objective functions which minimize to a probability. The two simplest functions in this class were found to be the well-known mean-squared error and cross-entropy objective functions.
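The central claim can be checked numerically for the two objectives named in the abstract. This is a minimal sketch (not from the paper) assuming a binary target t with P(t=1)=p: minimizing the expected mean-squared error and the expected cross-entropy over the network output x both recover x = p, i.e. these objectives "minimize to a probability".

```python
import numpy as np

def expected_mse(x, p):
    # E[(x - t)^2] = p*(x - 1)^2 + (1 - p)*x^2 for t in {0, 1}
    return p * (x - 1) ** 2 + (1 - p) * x ** 2

def expected_cross_entropy(x, p):
    # E[-t*log(x) - (1 - t)*log(1 - x)]
    return -p * np.log(x) - (1 - p) * np.log(1 - x)

p = 0.3                                   # true class probability (assumed)
xs = np.linspace(0.01, 0.99, 9801)        # grid of candidate outputs
x_mse = xs[np.argmin(expected_mse(xs, p))]
x_xent = xs[np.argmin(expected_cross_entropy(xs, p))]
# Both minimizers coincide with the underlying probability p = 0.3.
```

Both minimizers land on the grid point closest to p, consistent with both objectives belonging to the class the paper characterizes.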
Keywords :
entropy; neural nets; pattern recognition; probability; cross entropy; logical symmetry; mean-squared error; necessary and sufficient conditions; neural network classifiers; objective function; probability estimation; Backpropagation; Context modeling; Entropy; Laboratories; Mean square error methods; Neural networks; Pattern analysis; Probability; Propulsion; Testing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
Type :
conf
DOI :
10.1109/IJCNN.1991.155295
Filename :
155295