DocumentCode :
2075426
Title :
Mixture models achieving optimal coding regret
Author :
Barron, Andrew R. ; Takeuchi, Jun-ichi
Author_Institution :
Dept. of Stat., Yale Univ., New Haven, CT, USA
fYear :
1998
fDate :
22-26 Jun 1998
Firstpage :
16
Abstract :
Summary form only given. The relative entropy between the Jeffreys mixture and Shtarkov's normalized maximum likelihood (NML) distribution tends to zero as the sample size grows, for smooth parametric families of distributions; equivalently, the Jeffreys mixture is asymptotically maximin for the coding regret problem. This fact yields lower bounds, with explicit constants, that hold no matter what coding strategy is used, for most sequences generated by most distributions in the family. However, on a small set of sequences the Jeffreys mixture and the NML distribution do not agree asymptotically; that is, the Jeffreys mixture is not asymptotically minimax. The discrepancy occurs not only for sequences whose maximum likelihood parameter values lie near the boundary, but also, when the parametric family is not of exponential type, for sequences whose empirical Fisher information is not close to the Fisher information at the MLE. We show how non-exponential parametric families can be modified by an exponential tilting based on the empirical Fisher information, defining a somewhat higher-dimensional family. A mixture defined on this extended family, with most of its mass on the original family, is shown to achieve asymptotically minimax coding regret.
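The boundary effect described in the abstract can be illustrated numerically for the Bernoulli family, the simplest smooth (exponential-type) case. This sketch is not from the paper: it computes, for binary sequences of length n with k ones, the pointwise regret log p_MLE(x^n) - log q(x^n) of two codes q, the Shtarkov NML distribution (whose regret is constant over sequences, the minimax value log C_n) and the Jeffreys (Beta(1/2, 1/2)) mixture, whose regret matches the minimax value for interior sequences but exceeds it near the boundary k = 0 or k = n.

```python
import math
from math import lgamma, log

def log_ml(k, n):
    # Log-probability of a binary sequence with k ones under its own MLE p = k/n.
    if k == 0 or k == n:
        return 0.0
    p = k / n
    return k * log(p) + (n - k) * log(1 - p)

def log_nml(k, n):
    # Shtarkov NML: p_MLE(x^n) / C_n, with C_n = sum over all sequences
    # of the maximized likelihood; regret is then log C_n for every sequence.
    logC = log(sum(math.comb(n, j) * math.exp(log_ml(j, n))
                   for j in range(n + 1)))
    return log_ml(k, n) - logC

def log_jeffreys(k, n):
    # Jeffreys mixture for Bernoulli = Beta(1/2, 1/2) marginal likelihood
    # of one specific sequence with k ones.
    def logbeta(a, b):
        return lgamma(a) + lgamma(b) - lgamma(a + b)
    return logbeta(k + 0.5, n - k + 0.5) - logbeta(0.5, 0.5)

n = 200
ks = (0, n // 2, n)  # boundary, interior, boundary
# Pointwise regrets in bits.
reg_nml = [(log_ml(k, n) - log_nml(k, n)) / log(2) for k in ks]
reg_jef = [(log_ml(k, n) - log_jeffreys(k, n)) / log(2) for k in ks]
```

As the abstract states, `reg_nml` is the same for all three sequences (the minimax regret), while `reg_jef` agrees with it closely at k = n/2 but is larger at the boundary points, which is exactly the failure of asymptotic minimaxity that motivates the modified mixtures of the paper.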
Keywords :
encoding; entropy; maximum likelihood estimation; minimax techniques; Fisher information; Jeffreys mixture; MLE; asymptotically maximin mixture; boundary; explicit constants; exponential tilting; higher-dimensional family; lower bounds; maximum likelihood parameter values; mixture models; nonexponential parametric families; normalized maximum likelihood distribution; optimal coding regret; relative entropy; sample size; sequences; smooth parametric families; Data compression; Entropy; Information theory; Maximum likelihood estimation; Minimax techniques; National electric code; Parametric statistics; Predictive models; Statistical distributions;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Information Theory Workshop, 1998
Conference_Location :
Killarney
Print_ISBN :
0-7803-4408-1
Type :
conf
DOI :
10.1109/ITW.1998.706377
Filename :
706377