Title of article
On Bayesian learning via loss functions
Author/Authors
Pier Giovanni Bissiri, Stephen G. Walker
Issue Information
Journal issue, serial year 2012
Pages
7
From page
3167
To page
3173
Abstract
We provide a decision-theoretic approach to the construction of a learning process in the presence of independent and identically distributed observations. Starting with a probability measure representing beliefs about a key parameter, the approach allows the measure to be updated via the solution to a well-defined decision problem. While the learning process encompasses the Bayesian approach, a necessary asymptotic consideration implies that the Bayesian learning process is optimal. This conclusion follows from requiring posterior consistency for all models and standardized losses between probability distributions. This is shown for a specific continuous model and for a very general class of discrete models.
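A minimal sketch of the kind of loss-based update the abstract describes, written as a worked equation; the notation (prior \pi, loss \ell, model f) is illustrative and assumes the update takes the familiar exponential (Gibbs-posterior) form, which the abstract's Kullback–Leibler setting suggests but which is not quoted from the paper itself:

% Illustrative loss-based belief update (assumed Gibbs-posterior form):
% given a prior \pi(\theta), an observation x, and a loss \ell(\theta, x),
% the updated belief measure is
%   \pi(\theta \mid x) \propto \exp\{-\ell(\theta, x)\}\, \pi(\theta).
% With the self-information loss \ell(\theta, x) = -\log f(x \mid \theta),
% this reduces to the standard Bayesian posterior
%   \pi(\theta \mid x) \propto f(x \mid \theta)\, \pi(\theta),
% which is the sense in which the learning process encompasses Bayes.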
Keywords
Bayesian inference, Loss function, Posterior distribution, Kullback–Leibler divergence, g-divergence
Journal title
Journal of Statistical Planning and Inference
Serial Year
2012
Record number
2222167