Abstract:
We study the detection of Gauss-Markov signals using randomly spaced sensors. We derive a lower bound on the Bayesian detection error based on the Kullback-Leibler divergence and, from this, define an error exponent. We then evaluate the error exponent for stationary and non-stationary Gauss-Markov models in which the sensor spacings d_1, d_2, ... are drawn independently from a common distribution F_d. In both models, the error exponents take simple forms involving the parameters of the Markov process and expectations over F_d of certain functions of d_i. These expressions are evaluated explicitly when F_d corresponds to (i) exponentially distributed sensors with placement density lambda, (ii) equally spaced sensors, and (iii) the preceding cases when sensors also fail with probability q. Many insights follow. For example, in the non-stationary case, we determine the optimal lambda as a function of q. Numerical simulations show that the error exponent, based on an asymptotic analysis of the lower bound, accurately predicts trends in the actual error rate, even for small data sizes.
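To make the setup concrete, the following minimal Monte Carlo sketch (Python/NumPy) illustrates the kind of detection problem described above. The specific signal model (a stationary Ornstein-Uhlenbeck/Gauss-Markov covariance exp(-A|x_i - x_j|) observed in additive white Gaussian noise), the equal-prior likelihood-ratio detector, and all parameter values (A, lam, q, sigma2, n, trials) are illustrative assumptions chosen for this sketch, not quantities taken from the paper, and the paper's closed-form error-exponent expressions are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative parameters; values and model details are assumptions, not from the paper ---
A      = 1.0    # feedback parameter of the assumed stationary Gauss-Markov (OU) signal
lam    = 2.0    # placement density: spacings d_i ~ Exponential(rate = lam)
q      = 0.2    # probability that any given sensor has failed
sigma2 = 0.5    # additive measurement-noise variance (assumed)
n      = 50     # nominal number of sensor sites
trials = 4000   # Monte Carlo trials

def sensor_positions():
    """Random sensor locations: i.i.d. exponential spacings; failed sensors are dropped."""
    d = rng.exponential(scale=1.0 / lam, size=n)
    x = np.cumsum(d)
    alive = rng.random(n) >= q
    return x[alive]

def covariance_pair(x):
    """H0: noise only; H1: OU signal with correlation exp(-A|x_i - x_j|) plus noise."""
    D = np.abs(x[:, None] - x[None, :])
    Sigma1 = np.exp(-A * D) + sigma2 * np.eye(len(x))
    Sigma0 = sigma2 * np.eye(len(x))
    return Sigma0, Sigma1

def kl_gauss(Sigma0, Sigma1):
    """KL divergence D(N(0,Sigma0) || N(0,Sigma1)), the quantity behind a KL-based lower bound."""
    k = Sigma0.shape[0]
    tr = np.trace(np.linalg.solve(Sigma1, Sigma0))
    logdet = np.linalg.slogdet(Sigma1)[1] - np.linalg.slogdet(Sigma0)[1]
    return 0.5 * (tr - k + logdet)

errors, kl_per_sensor = 0.0, []
for _ in range(trials):
    x = sensor_positions()
    h = rng.integers(2)                      # equal-prior hypothesis: 0 = noise only, 1 = signal present
    if len(x) == 0:                          # every sensor failed: guess H0
        errors += (h == 1)
        continue
    Sigma0, Sigma1 = covariance_pair(x)
    y = rng.multivariate_normal(np.zeros(len(x)), Sigma1 if h else Sigma0)
    # Bayes-optimal likelihood-ratio test for zero-mean Gaussians with equal priors
    llr = 0.5 * (y @ np.linalg.solve(Sigma0, y) - y @ np.linalg.solve(Sigma1, y)) \
        + 0.5 * (np.linalg.slogdet(Sigma0)[1] - np.linalg.slogdet(Sigma1)[1])
    errors += ((llr > 0) != h)
    kl_per_sensor.append(kl_gauss(Sigma0, Sigma1) / len(x))

print(f"Monte Carlo Bayes error estimate : {errors / trials:.4f}")
print(f"average KL divergence per sensor : {np.mean(kl_per_sensor):.4f}")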
Keywords:
Bayes methods; Gaussian processes; Markov processes; error detection; signal detection; Bayesian detection; distributed Bayesian detection; Gauss-Markov signals; Kullback-Leibler divergence; error exponent; randomly spaced sensors; acoustic sensors; acoustic signal detection; numerical simulation; random variables