DocumentCode :
1245322
Title :
Lower bounds on expected redundancy for nonparametric classes
Author :
Yu, Bin
Author_Institution :
Dept. of Stat., California Univ., Berkeley, CA, USA
Volume :
42
Issue :
1
fYear :
1996
fDate :
1/1/1996 12:00:00 AM
Firstpage :
272
Lastpage :
275
Abstract :
The article presents lower bound results on expected redundancy for universal coding of independent and identically distributed data on [0, 1] from parametric and nonparametric families. After reviewing existing lower bounds, we give a new proof of minimax lower bounds on expected redundancy over nonparametric density classes. The new proof is based on the calculation of a mutual information quantity; that is, it utilizes the relationship between redundancy and Shannon channel capacity. It thereby unifies the minimax redundancy lower bound proofs in the parametric and nonparametric cases.
Keywords :
channel capacity; encoding; minimax techniques; redundancy; Shannon capacity; expected redundancy; independent identically distributed data; minimax lower bounds; minimax redundancy; mutual information; nonparametric density classes; nonparametric families; parametric families; universal coding; Complexity theory; Conferences; Minimax techniques; Mutual information; Parametric statistics; Q measurement; Redundancy; Stochastic processes;
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.481802
Filename :
481802