Title of article :
Imprecise probabilities for representing ignorance about a parameter Original Research Article
Author/Authors :
Serafin Moral
Issue Information :
Journal issue, serial year 2012
Pages :
16
From page :
347
To page :
362
Abstract :
This paper distinguishes between objective probability—or chance—and subjective probability. Most statistical methods in machine learning are based on the hypothesis that there is a random experiment from which we obtain a set of observations. This random experiment can be identified with a chance or objective probability, but these probabilities depend on some unknown parameters. Our knowledge of these parameters is not objective, and in order to learn about them we must assess epistemic probabilities about their values. In some cases our objective knowledge about these parameters is vacuous, so the question is: what epistemic probabilities should be assumed? In this paper we argue for the assumption of non-vacuous interval probabilities (a proper subset of [0, 1]). There are several reasons for this; some are based on the betting interpretation of epistemic probabilities, while others are based on the learning capabilities under the vacuous representation. The implications of the selection of epistemic probabilities for concepts such as conditioning and learning are studied. It is shown that, in order to maintain reasonable learning capabilities, we have to assume more informative prior models than those frequently used in the literature, such as the imprecise Dirichlet model.
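As an illustration of the imprecise Dirichlet model (IDM) mentioned in the abstract, the following sketch computes the standard IDM probability intervals [n_i/(N+s), (n_i+s)/(N+s)] for observed counts n_i and hyperparameter s. The function name and the example counts are assumptions for illustration, not taken from the paper:

```python
# Illustrative sketch of the imprecise Dirichlet model (IDM); the function
# name and the example data are assumptions chosen for this example.
def idm_intervals(counts, s=1.0):
    """Return (lower, upper) probability intervals for each category.

    counts: observed frequencies n_i of each category.
    s: the IDM hyperparameter controlling the degree of imprecision.
    """
    n = sum(counts)
    return [(c / (n + s), (c + s) / (n + s)) for c in counts]

# With no observations the intervals are vacuous: [0, 1] for every category,
# which is the prior ignorance representation the paper questions.
print(idm_intervals([0, 0], s=1.0))   # [(0.0, 1.0), (0.0, 1.0)]
# After observations the intervals narrow around the relative frequencies.
print(idm_intervals([7, 3], s=1.0))
```

This makes concrete the abstract's point: under the IDM the prior intervals are vacuous, and the paper argues that more informative (non-vacuous) prior models are needed to preserve reasonable learning behaviour.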
Keywords :
Learning , Chance , Ignorance , Degree of belief , Imprecise probability , Conditioning
Journal title :
International Journal of Approximate Reasoning
Serial Year :
2012
Record number :
1183111