Title of article :
On Bayesian learning from Bernoulli observations
Author/Authors :
Bissiri, Pier Giovanni and Walker, Stephen G.
Issue Information :
Journal, Serial Year 2010
Pages :
11
From page :
3520
To page :
3530
Abstract :
We provide a justification for Bayesian updating in the Bernoulli case, even when the observations are assumed to be independent and identically distributed with a fixed but unknown parameter θ0. The motivation relies on the use of loss functions and asymptotics. Such a justification is important given the recent interest in Bayesian consistency, which indeed assumes that the observations are independent and identically distributed rather than conditionally independent with a joint distribution depending on the choice of prior.
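As an illustrative sketch only (not the paper's loss-function derivation): for i.i.d. Bernoulli observations, standard Bayesian updating with a conjugate Beta prior reduces to counting successes and failures. The function name and prior choice below are hypothetical, for illustration.

```python
# Conjugate Bayesian updating for i.i.d. Bernoulli data with a Beta prior.
# Beta(a, b) prior + s successes and f failures -> Beta(a + s, b + f) posterior.
def update_beta(a, b, observations):
    """Update a Beta(a, b) prior with a list of 0/1 Bernoulli observations."""
    successes = sum(observations)
    failures = len(observations) - successes
    return a + successes, b + failures

# Start from a uniform Beta(1, 1) prior and observe some data.
data = [1, 0, 1, 1, 0, 1, 1, 1]
a, b = update_beta(1, 1, data)
posterior_mean = a / (a + b)  # posterior point estimate of theta
print(a, b, posterior_mean)  # -> 7 3 0.7
```

As the number of observations grows, the posterior mean converges to the true parameter θ0, which is the asymptotic behaviour the paper's consistency discussion concerns.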
Keywords :
Kullback–Leibler divergence , Loss function , Asymptotics
Journal title :
Journal of Statistical Planning and Inference
Serial Year :
2010
Record number :
2220999