DocumentCode :
2989879
Title :
Achievability results for statistical learning under communication constraints
Author :
Raginsky, Maxim
Author_Institution :
Dept. of Electr. & Comput. Eng., Duke Univ., Durham, NC, USA
fYear :
2009
fDate :
June 28 - July 3, 2009
Firstpage :
1328
Lastpage :
1332
Abstract :
The problem of statistical learning is to construct an accurate predictor of a random variable as a function of a correlated random variable on the basis of an i.i.d. training sample from their joint distribution. Allowable predictors are constrained to lie in some specified class, and the goal is to approach asymptotically the performance of the best predictor in the class. We consider two settings in which the learning agent only has access to rate-limited descriptions of the training data, and present information-theoretic bounds on the predictor performance achievable in the presence of these communication constraints. Our proofs do not assume any separation structure between compression and learning and rely on a new class of operational criteria specifically tailored to joint design of encoders and learning algorithms in rate-constrained settings.
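As a schematic formalization of the setting described in the abstract (a sketch, not taken from the paper; the loss \ell, predictor class \mathcal{F}, and rate R are generic symbols introduced here for illustration), the unconstrained learning goal is to drive the excess risk to zero:
\[
L(f) \;=\; \mathbb{E}\big[\ell(f(X),Y)\big], \qquad L(\hat{f}_n) \;-\; \inf_{f \in \mathcal{F}} L(f) \;\longrightarrow\; 0 \quad (n \to \infty),
\]
where \hat{f}_n is constructed from an i.i.d. sample (X_1,Y_1),\ldots,(X_n,Y_n). Under the communication constraints considered in the paper, \hat{f}_n must instead be computed from a rate-limited description of the training data (e.g., an encoding into roughly nR bits), and the achievable excess risk is bounded as a function of the rate.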
Keywords :
information theory; statistical analysis; communication constraints; encoder algorithms; learning algorithms; random variable; statistical learning; Algorithm design and analysis; Biological information theory; Biological system modeling; Input variables; Probability distribution; Random variables; Statistical learning; Training data; Uncertainty; Vector quantization;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
2009 IEEE International Symposium on Information Theory (ISIT 2009)
Conference_Location :
Seoul, South Korea
Print_ISBN :
978-1-4244-4312-3
Electronic_ISBN :
978-1-4244-4313-0
Type :
conf
DOI :
10.1109/ISIT.2009.5205933
Filename :
5205933