Abstract:
When relative motion exists between the antenna of a microwave radiometer and a random background, the variance of the radiometer output can, in many cases, be determined primarily by the spatial variance of the background. The result is a decrease in signal-to-noise ratio for small-target detection. In this paper an interferometer is examined as a potential means of suppressing this background variance until the limiting variance imposed by the receiver is more nearly realized. A bivariate normal distribution, in conjunction with three different correlation functions, is assumed for the background apparent temperature field. The elements of the interferometer are assumed to possess a very narrow "pencil" power pattern. With relative motion between antenna and background, the audio power density spectrum at the output of the interferometer is examined. The background variance suppression achieved by an interferometer is shown to be a function of element spacing, wavelength, correlation distance of the background, range to the background, and the form of the background correlation function. The result is that a radiometer may be designed which is receiver-limited rather than background-limited.
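To make the suppression mechanism concrete, the following is a minimal numerical sketch, not taken from the paper, of the effect the abstract describes: a spatially correlated background is swept past a beam-weighted total-power radiometer and past a two-element interferometer whose fringe pattern modulates the same footprint. All parameter values (correlation distance rho, element spacing d, wavelength lam, range R, footprint scale, and the Gaussian correlation form) are illustrative assumptions chosen only to show the trend that when the projected fringe period lam*R/d is small compared with the background correlation distance, the interferometer output variance is strongly suppressed.

```python
# Illustrative sketch (assumed parameters, not values from the paper):
# compare output variance of a total-power radiometer and a two-element
# interferometer scanning the same spatially correlated background.
import numpy as np

rng = np.random.default_rng(0)

# Synthesize a 1-D background apparent-temperature field T(x) with an
# (assumed) Gaussian spatial correlation, correlation distance rho.
N, dx = 16384, 0.05                     # samples, spatial step (m)
rho = 2.0                               # background correlation distance (m)
white = rng.standard_normal(N)
kx = np.arange(-4 * rho, 4 * rho, dx)
T = np.convolve(white, np.exp(-kx**2 / rho**2), mode="same")
T /= T.std()                            # normalize to unit background variance

# Assumed geometry: wavelength, element spacing, range to the background.
lam, d, R = 0.03, 3.0, 100.0            # (m)
fringe_period = lam * R / d             # fringe spacing projected on background (m)
beam = 5.0                              # projected "pencil" footprint scale (m)

wx = np.arange(-3 * beam, 3 * beam, dx)
w = np.exp(-wx**2 / beam**2)            # beam footprint weighting
fringe = w * np.cos(2 * np.pi * wx / fringe_period)  # fringe-modulated weighting

# Relative motion: slide the footprint along the field and record both outputs.
radiometer = np.convolve(T, w / w.sum(), mode="valid")
interferom = np.convolve(T, fringe / w.sum(), mode="valid")

print(f"fringe period       : {fringe_period:.2f} m (vs rho = {rho} m)")
print(f"radiometer variance : {radiometer.var():.4f}")
print(f"interferometer var. : {interferom.var():.4f}")
# With lam*R/d << rho, the fringe averages the correlated background toward
# zero, so the interferometer output variance is far below the radiometer's.
```

Varying d, lam, R, rho, or the assumed correlation shape in this sketch changes the suppression ratio, mirroring the abstract's statement that the suppression depends on element spacing, wavelength, correlation distance, range, and the form of the background correlation function.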