Let $X_1, X_2, \dots$ be independent identically distributed random variables taking values in a finite set $S$, and consider the conditional joint distribution of the first $m$ elements of the sample $X_1, \dots, X_n$, on the condition that $X_1 = x_1$ and the sliding block sample average of a function $h$ defined on $S \times S$ exceeds a threshold $\alpha$. For $m$ and $x_1$ fixed and $n \to \infty$, this conditional joint distribution is shown to converge to the $m$-step joint distribution of a Markov chain started in $x_1$ which is closest to $X_1, X_2, \dots$ in Kullback-Leibler information divergence among all Markov chains whose two-dimensional stationary distribution $\bar{P}$ satisfies $\sum_{x,y} \bar{P}(x,y)\,h(x,y) \ge \alpha$, provided some distribution $\bar{P}$ on $S \times S$ having equal marginals does satisfy this constraint with strict inequality. Similar conditional limit theorems are obtained when $X_1, X_2, \dots$ is an arbitrary finite-order Markov chain and more general conditioning is allowed.
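
To make the limit concrete, the following display is a sketch of the claimed convergence under the notation above; the conditioning event $A_n$ and the limiting chain $(Y_i)$ are labels introduced here for illustration and are not taken from the abstract itself.
\[
  A_n = \Bigl\{\, X_1 = x_1,\ \frac{1}{n-1}\sum_{i=1}^{n-1} h(X_i, X_{i+1}) \ge \alpha \,\Bigr\},
\]
\[
  \lim_{n\to\infty} \Pr\{X_1 = y_1, \dots, X_m = y_m \mid A_n\}
    = \Pr\{Y_1 = y_1, \dots, Y_m = y_m\},
\]
where $(Y_i)$ is the Markov chain started in $x_1$ that minimizes the Kullback-Leibler information divergence from the i.i.d. process $X_1, X_2, \dots$ among all Markov chains whose two-dimensional stationary distribution $\bar{P}$ satisfies $\sum_{x,y} \bar{P}(x,y)\,h(x,y) \ge \alpha$.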