DocumentCode :
44977
Title :
Properties of Jeffreys Mixture for Markov Sources
Author :
Takeuchi, Jun'ichi ; Kawabata, Tsutomu ; Barron, Andrew R.
Author_Institution :
Fac. of Inf. Sci. & Electr. Eng., Kyushu Univ., Fukuoka, Japan
Volume :
59
Issue :
1
fYear :
2013
fDate :
Jan. 2013
Firstpage :
438
Lastpage :
457
Abstract :
We discuss the properties of Jeffreys mixture for a Markov model. First, we show that a modified Jeffreys mixture asymptotically achieves the minimax coding regret for universal data compression, where no restriction is placed on the data sequences. We also give an approximation formula for the prediction probability of Jeffreys mixture for a Markov model. This formula reveals that the prediction probability of Jeffreys mixture for the Markov model with alphabet {0,1} is not of the form (n_{x|s} + α)/(n_s + β), where n_{x|s} is the number of occurrences of the symbol x following the context s ∈ {0,1} and n_s = n_{0|s} + n_{1|s}. Finally, we propose a method to compute our minimax strategy that combines a Monte Carlo method with the approximation formula: the former is used for the earlier stages of the data and the latter for the later stages.
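For context, a minimal sketch of the add-constant family of predictors (n_{x|s} + α)/(n_s + β) that the abstract refers to, applied to a binary first-order Markov source. This is not the paper's Jeffreys-mixture strategy (which the paper shows is *not* of this form); with α = 1/2, β = 1 it is the Krichevsky–Trofimov-style estimator applied per context. The function name and interface are illustrative assumptions.

```python
def add_constant_markov_predict(seq, alpha=0.5, beta=1.0):
    """Sequential prediction probabilities P(x_t | previous symbol)
    of the form (n_{x|s} + alpha) / (n_s + beta) for a binary
    first-order Markov source, where n_{x|s} counts occurrences of
    symbol x following context s in {0, 1} and n_s = n_{0|s} + n_{1|s}.
    """
    counts = {0: [0, 0], 1: [0, 0]}  # counts[s][x] = n_{x|s}
    probs = []
    for prev, x in zip(seq, seq[1:]):
        n_xs = counts[prev][x]
        n_s = counts[prev][0] + counts[prev][1]
        probs.append((n_xs + alpha) / (n_s + beta))
        counts[prev][x] += 1  # update counts after predicting
    return probs

# Example: predict each symbol of 0,1,1,0,1 given its predecessor.
probs = add_constant_markov_predict([0, 1, 1, 0, 1])
```

The point of the paper's approximation formula is that the exact Jeffreys-mixture predictor for the Markov model cannot be written in this simple count-ratio form, unlike in the i.i.d. case.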
Keywords :
Markov processes; Monte Carlo methods; approximation theory; data compression; encoding; minimax techniques; prediction theory; probability; Jeffreys mixture; Markov source; Monte Carlo method; approximation formula; data sequence; minimax coding regret; minimax strategy; prediction probability; universal data compression; Approximation methods; Context; Encoding; Markov processes; Maximum likelihood estimation; Upper bound; Vectors; Bayes code; Jeffreys prior; minimax regret; stochastic complexity; universal source coding;
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2012.2219171
Filename :
6307868