DocumentCode :
1087100
Title :
The asymptotics of posterior entropy and error probability for Bayesian estimation
Author :
Kanaya, Fumio ; Te Sun Han
Author_Institution :
Shonan Inst. of Technol., Fujisawa, Japan
Volume :
41
Issue :
6
fYear :
1995
fDate :
1 November 1995
Firstpage :
1988
Lastpage :
1992
Abstract :
We consider the Bayesian parameter estimation problem in which the value of a finitary parameter X is to be decided on the basis of an i.i.d. sample Y^n of size n. In this context, the amount of missing information about X after observing Y^n may be evaluated by the posterior entropy of X given Y^n, often called the equivocation or the conditional entropy, while it is well known that the minimum possible probability of error in estimating X is achieved by the maximum a posteriori probability (MAP) estimator. This work focuses on the asymptotic relation between the posterior entropy and the MAP error probability as the sample size n becomes large. It is shown that, for sufficiently large n, the posterior entropy and the MAP error probability decay to zero at the same exponential rate in n, and that the maximum achievable exponent of this decay is determined by the minimum Chernoff information over all pairs of distinct parameter values.
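For reference, the result described in the abstract can be sketched in standard notation (the symbols below are conventional choices, not taken verbatim from the paper). For distinct parameter values x and x' with observation distributions P_x and P_{x'}, the Chernoff information is
\[
C(P_x, P_{x'}) \;=\; -\min_{0 \le \lambda \le 1} \log \sum_{y} P_x(y)^{\lambda}\, P_{x'}(y)^{1-\lambda},
\]
and the asymptotic relation claimed in the abstract reads
\[
\lim_{n\to\infty} -\frac{1}{n} \log H(X \mid Y^n) \;=\; \lim_{n\to\infty} -\frac{1}{n} \log P_e^{(n)} \;=\; \min_{x \ne x'} C(P_x, P_{x'}),
\]
where H(X | Y^n) is the posterior entropy and P_e^{(n)} denotes the MAP error probability based on Y^n.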
Keywords :
Bayes methods; entropy; error statistics; maximum likelihood estimation; probability; Bayesian estimation; MAP error probability; MAP estimator; asymptotics; conditional entropy; equivocation entropy; error probability; information theory; maximum a posteriori probability estimator; minimum Chernoff information; parameter estimation; posterior entropy; Bayesian methods; Entropy; Error probability; Parameter estimation; Probability distribution; Random variables; Upper bound;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.476321
Filename :
476321