DocumentCode :
942540
Title :
An asymptotically least-favorable Chernoff bound for a large class of dependent data processes
Author :
Sadowsky, John S.
Volume :
33
Issue :
1
fYear :
1987
fDate :
1/1/1987 12:00:00 AM
Firstpage :
52
Lastpage :
61
Abstract :
It is desired to determine the worst-case asymptotic error probability performance of a given detector operating in an environment of uncertain data dependency. A class of Markov data process distributions is considered which satisfy a one-shift dependency bound and agree with a specified univariate distribution. Within this dependency contamination class, the distribution structure which minimizes the exponential rate of decrease of the detection error probabilities is identified. This is a uniform least-favorability principle, because the least-favorable dependency structure is the same for all bounded memoryless detectors. The error probability exponential rate criterion used is a device of large deviations theory. The results agree well with previous results obtained using Pitman's asymptotic relative efficiency (ARE), which is a more tractable small-signal performance criterion. In contrast to ARE, large deviations theory is closely related to finite-sample error probabilities via finite-sample Chernoff bounds and other exponentially tight bounds and approximations.
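(Background sketch, not taken from the record itself: the exponential rate criterion mentioned above is the exponent of the standard Chernoff bound. For an i.i.d. memoryless detector statistic $\Lambda_n = \sum_{i=1}^{n} \ell(X_i)$ compared with a threshold $n\tau$, the bound reads

\[
P\!\left(\Lambda_n \ge n\tau\right) \;\le\; \exp\!\Big(-n \sup_{s \ge 0}\big[\,s\tau - \log E\, e^{s\,\ell(X_1)}\,\big]\Big),
\]

so the bracketed supremum is the exponential rate of decrease of the error probability. For Markov-dependent data, the log moment generating function is typically replaced by the logarithm of the largest eigenvalue of an exponentially tilted transition kernel; this is the generic large-deviations device, stated here as an illustration of the setting rather than the paper's exact formulation.)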
Keywords :
Markov processes; Signal detection; Autocorrelation; Contamination; Degradation; Detectors; Error probability; Pollution measurement; Reactive power;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.1987.1057270
Filename :
1057270