Abstract:
The behavior of orthogonal frequency division multiplexing (OFDM) signals passed through a bandpass nonlinearity is presented. In particular, the in-band bit-error-rate (BER) degradation and the adjacent channel interference induced by amplitude limiting or clipping are analyzed. In the presence of both nonlinear distortion and additive noise, an optimized output power back-off is derived to balance the requirement of minimum BER against the tolerable adjacent channel interference for a given OFDM system.
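The clipping effect summarized above can be illustrated with a minimal numerical sketch. The following Python snippet (an illustrative assumption, not the paper's simulation setup: the subcarrier count, QPSK mapping, and 6 dB back-off value are all hypothetical choices) builds one OFDM symbol via an IFFT, applies an ideal envelope limiter whose clipping level is set by a chosen back-off, and reports the peak-to-average power ratio (PAPR) and the resulting distortion power, a rough proxy for the in-band degradation and spectral leakage discussed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 256  # number of subcarriers (illustrative choice)

# Random QPSK symbols on each subcarrier, unit average power
bits = rng.integers(0, 2, size=(N, 2))
symbols = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

# OFDM time-domain signal via IFFT, scaled to unit average power
x = np.fft.ifft(symbols) * np.sqrt(N)

# Ideal envelope limiter: clip the magnitude at a level set by the
# back-off relative to the average signal power (6 dB is an assumed value)
backoff_db = 6.0
clip_level = np.sqrt(10 ** (backoff_db / 10))

mag = np.abs(x)
scale = np.minimum(1.0, clip_level / np.maximum(mag, 1e-12))
clipped = x * scale  # phase preserved, magnitude limited

# Distortion power: proxy for in-band BER degradation and
# out-of-band (adjacent channel) leakage caused by clipping
distortion = np.mean(np.abs(x - clipped) ** 2)
papr_db = 10 * np.log10(np.max(mag ** 2) / np.mean(mag ** 2))
print(f"PAPR = {papr_db:.1f} dB, clipping distortion power = {distortion:.2e}")
```

Raising `backoff_db` reduces the distortion power at the cost of amplifier efficiency, which is the trade-off the optimized back-off in the abstract is meant to resolve.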