The maximum improvement that can be expected from amplitude and/or phase adaptivity in the performance of an optimum receiver is evaluated in detail for a general class of broad- and narrow-band signals subject to slow Rayleigh fading in the "on-off" mode of operation. The pertinent Bayes risks are evaluated and compared for different states of a priori information. It is found that the advantage of amplitude adaptivity increases without bound with the SNR, whereas the advantage of phase adaptivity is a decreasing function of the SNR. In the region of interest, that of moderately low error probabilities, and with equal a priori error risks, amplitude adaptivity yields an improvement in performance equivalent to … dB in SNR, whereas phase adaptivity yields an improvement of … or … dB, depending on whether or not the amplitude is also known. The effect of asymmetry of the a priori error risks is discussed. It is also pointed out that the optimum test for deciding on the presence or absence of the signal is uniformly most powerful with respect to the amplitude, so that no advantage can be expected from amplitude adaptivity in the Neyman-Pearson mode of operation.
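The comparison summarized above can be illustrated numerically. The sketch below is not the paper's closed-form Bayes-risk analysis; it is a minimal Monte Carlo comparison, under assumed conventions (equal priors and costs, noncoherent envelope detection, SNR defined as mean signal energy over the per-quadrature noise variance), of two Bayes receivers for on-off signaling in slow Rayleigh fading: one that knows the fading amplitude on each transmission and one that only knows its Rayleigh prior.

```python
# Illustrative Monte Carlo sketch (assumptions noted above), not the paper's analysis:
# noncoherent detection of an on-off signal in slow Rayleigh fading, with and
# without knowledge of the fading amplitude.
import numpy as np
from scipy.special import i0e

rng = np.random.default_rng(0)
n_trials = 200_000
sigma_n = 1.0                                   # noise std per quadrature component

for snr_db in (5, 10, 15, 20):
    snr = 10 ** (snr_db / 10)                   # average SNR, E[A^2] / (2 sigma_n^2)
    sigma_a = sigma_n * np.sqrt(snr)

    # Slow Rayleigh fading amplitude, unknown uniform phase, additive Gaussian noise
    bits = rng.integers(0, 2, n_trials)         # on-off keying, equal priors
    amp = sigma_a * np.sqrt(rng.exponential(2.0, n_trials))   # Rayleigh, E[A^2] = 2 sigma_a^2
    phase = rng.uniform(0, 2 * np.pi, n_trials)
    noise = sigma_n * (rng.standard_normal(n_trials) + 1j * rng.standard_normal(n_trials))
    r = np.abs(bits * amp * np.exp(1j * phase) + noise)       # envelope statistic

    # (a) Amplitude-adaptive receiver: per-trial likelihood-ratio test with A known.
    #     Decide "on" if log I0(A r / sigma^2) > A^2 / (2 sigma^2).
    x = amp * r / sigma_n**2
    dec_adapt = np.log(i0e(x)) + x > amp**2 / (2 * sigma_n**2)

    # (b) Non-adaptive Bayes receiver: amplitude averaged over its Rayleigh prior,
    #     which reduces to a fixed threshold on the squared envelope.
    s2, sa2 = sigma_n**2, sigma_a**2
    thr2 = 2 * s2 * (s2 + sa2) / sa2 * np.log((s2 + sa2) / s2)
    dec_fixed = r**2 > thr2

    print(f"{snr_db:2d} dB:  adaptive Pe = {np.mean(dec_adapt != bits):.4f}, "
          f"non-adaptive Pe = {np.mean(dec_fixed != bits):.4f}")
```

Running the sketch shows the gap between the two error probabilities widening as the average SNR grows, consistent with the statement that the advantage of amplitude adaptivity increases without bound with the SNR.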