DocumentCode :
2056787
Title :
Improved convergence rates in empirical vector quantizer design
Author :
Antos, András ; Györfi, László ; György, András
Author_Institution :
Informatics Lab., Hungarian Acad. of Sci., Budapest
fYear :
2004
fDate :
2004
Firstpage :
300
Lastpage :
300
Abstract :
We consider the rate of convergence of the expected distortion redundancy of empirically optimal vector quantizers. Earlier results show that the mean-squared distortion of an empirically optimal quantizer designed from n independent and identically distributed source samples converges uniformly to the optimum at a rate O(1/√n), and that this rate is sharp in the minimax sense. We prove that for any fixed source distribution supported on a given finite set, the convergence rate is O(1/n) (faster than the minimax lower bound), where the corresponding constant depends on the distribution. For more general source distributions, we provide conditions implying a slightly worse O(log n/n) rate of convergence. In particular, scalar distributions having strictly log-concave densities with bounded support (such as the truncated Gaussian distribution) satisfy these conditions.
Keywords :
convergence; source coding; vector quantisation; convergence rate; distortion redundancy; fixed source distribution; optimal vector quantizers; Automation; Convergence; Gaussian distribution; Informatics; Laboratories; Mathematics; Minimax techniques; Q measurement; Statistical distributions; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the International Symposium on Information Theory (ISIT 2004)
Conference_Location :
Chicago, IL
Print_ISBN :
0-7803-8280-3
Type :
conf
DOI :
10.1109/ISIT.2004.1365337
Filename :
1365337