Abstract:
The bit error rate performance of on-off keyed spectrally sliced burst mode receivers in the presence of dispersion is analysed for the first time. The treatment includes realistic excess noise statistics together with the effects of pulse distortion and intersymbol interference caused by the dispersion. Numerical results are obtained using the saddlepoint approximation, which is known to be both computationally efficient and accurate. The variation in power penalty with fibre path length, using the optimum optical filter bandwidth, is determined, along with the impact of burst mode preamble length. The optimum ratio between the fibre path length and the distance over which the root mean square width of a spectrally sliced pulse doubles is found to lie between 0.75 and 0.76. Although the optical power penalty is greater than in a laser-based burst mode system, operation close to this optimum ratio reduces it to <1 dB for preambles of 4 bits or more.