Title of article :
Extension of the PAC framework to finite and countable Markov chains
Author/Authors :
D. Gamarnik
Issue Information :
Monthly, serial issue of 2003
Pages :
-337
From page :
338
To page :
0
Abstract :
We consider a model of learning in which the successive observations follow a certain Markov chain. The observations are labeled according to membership in some unknown target set. For a Markov chain with finitely many states, we show that if the target set belongs to a family of sets with finite Vapnik-Chervonenkis (1995) dimension, then probably approximately correct (PAC) learning of this set is possible with polynomially large samples. Specifically, for observations following a random walk with state space $\chi$ and uniform stationary distribution, the sample size required is no more than $\Omega\!\left(\frac{t_0}{1-\lambda_2}\,\log\frac{t_0\,|\chi|}{\delta}\right)$, where $\delta$ is the confidence level, $\lambda_2$ is the second largest eigenvalue of the transition matrix, and $t_0$ is the sample size sufficient for learning from independent and identically distributed (i.i.d.) observations. We then obtain similar results for Markov chains with countably many states using a Lyapunov function technique and results on the mixing properties of infinite-state Markov chains.
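As a hedged illustration (not part of the paper), the minimal sketch below simply evaluates the sample-size expression quoted in the abstract, $\frac{t_0}{1-\lambda_2}\,\log\frac{t_0\,|\chi|}{\delta}$, to show how $t_0$, $\lambda_2$, $|\chi|$, and $\delta$ enter the bound. The function name and all parameter values are hypothetical and chosen only for demonstration.

```python
# Minimal sketch: evaluate the sample-size bound quoted in the abstract,
#     t0 / (1 - lambda2) * log(t0 * |X| / delta).
# Everything here (names, example values) is illustrative, not from the paper.

import math


def markov_pac_sample_bound(t0: float, lambda2: float, state_count: int, delta: float) -> float:
    """Evaluate t0 / (1 - lambda2) * log(t0 * |X| / delta).

    t0          -- sample size sufficient for PAC learning from i.i.d. observations
    lambda2     -- second largest eigenvalue of the chain's transition matrix
    state_count -- |X|, number of states of the finite Markov chain
    delta       -- confidence parameter
    """
    if not 0.0 <= lambda2 < 1.0:
        raise ValueError("bound requires 0 <= lambda2 < 1 (positive spectral gap)")
    return t0 / (1.0 - lambda2) * math.log(t0 * state_count / delta)


if __name__ == "__main__":
    # Hypothetical example: i.i.d. sample size 1000, lambda2 = 0.9,
    # 10^4 states, confidence delta = 0.05.
    print(markov_pac_sample_bound(t0=1000, lambda2=0.9, state_count=10_000, delta=0.05))
```

Note the role of the spectral gap $1-\lambda_2$: the slower the chain mixes (the closer $\lambda_2$ is to 1), the larger the required sample relative to the i.i.d. sample size $t_0$.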
Keywords :
PAC learning; Markov chains; Vapnik-Chervonenkis dimension; sample complexity
Journal title :
IEEE Transactions on Information Theory
Serial Year :
2003
Record number :
94818