We discuss the transient behavior of a delay-locked loop designed to generate a delay-error signal proportional to the difference in the autocorrelation function of the input signal at two points separated by a fixed time. When the input signal is a sine wave, we present an exact solution which shows that the system is stable and achieves delay lock with an ambiguity of an integral number of periods. The second input considered is a stationary, ergodic, band-limited Gaussian signal. In this case we present an approximate analysis which predicts that, for times long compared to the inverse bandwidth of the random signal, the delay error is log-normally distributed. For this case we develop the almost-sure sample stability criterion [1]. When this criterion is met, the system's sample solutions are stable with probability one, independent of the system amplification. We also develop stability criteria which limit the system amplification for stability of the first and second moments of the time delay.
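The error-generation principle can be illustrated numerically. The sketch below is an assumption-laden discrete-time caricature, not the paper's continuous-time model: the loop's error signal is formed as the difference of sample autocorrelation estimates at two lags separated by a fixed offset, and the delay estimate is updated by a gain times that error. The names `delta`, `gain`, `tau`, and the sine parameters are all hypothetical choices for this example; for a sine input the error vanishes when the delay estimate sits at an integral number of periods, which is the ambiguity noted above.

```python
import numpy as np

def autocorr(x, lag):
    """Biased sample autocorrelation of x at an integer lag (even in lag)."""
    lag = abs(int(round(lag)))
    if lag == 0:
        return float(np.mean(x * x))
    return float(np.mean(x[:-lag] * x[lag:]))

fs = 1000.0                          # sample rate, Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 5.0 * t)      # sine input; period = 200 samples

delta = 10      # fixed separation between the two correlation taps (samples)
gain = 50.0     # loop amplification (assumed; large gains destabilize the loop)
tau = 60.0      # initial delay estimate (samples); lock points are 0 mod 200

for _ in range(200):
    # Delay-error signal: difference of the autocorrelation at two lags
    # straddling the current delay estimate.  For a sine it is
    # proportional to sin(2*pi*tau/period), so tau is driven toward
    # the nearest multiple of the period.
    err = autocorr(x, tau - delta) - autocorr(x, tau + delta)
    tau -= gain * err

# tau settles near an integral number of periods (here, near 0)
```

Note that the same fixed point is reached from any starting delay within half a period of a lock point, which is the discrete analogue of the period-ambiguity in the exact sine-wave solution.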