Because of the current strong emphasis on low sidelobe antennas, the effects of measurement distance in distorting patterns are reexamined. Previous calculations have used obsolete or suboptimum aperture distributions. The Taylor n̄
linear distribution is a versatile, highly efficient, and robust optimum distribution; its use here allows a single curve of sidelobe measurement error versus measurement distance (normalized to the far-field distance 2D²/λ) for a given sidelobe level. The calculations give data from a uniform distribution to a 60 dB Taylor. For example, the first sidelobe of a 40 dB Taylor pattern is in error by 1 dB at a distance of

.
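A minimal numerical sketch of this kind of calculation follows; it is an illustration under simplifying assumptions, not the computation used in the paper. The finite-distance measurement of a line-source pattern is modeled by inserting the quadratic (Fresnel) phase error kx²/(2R) into the aperture integral, SciPy's Taylor window stands in for the Taylor n̄ distribution, and the range R is stepped in multiples of the conventional far-field distance 2D²/λ. The aperture length, n̄, and sampling densities are arbitrary example choices.

```python
# Sketch: first-sidelobe error of a Taylor n-bar line-source pattern when the
# pattern is "measured" at a finite distance R. The finite range is modeled by
# the quadratic (Fresnel) phase error k*x^2/(2R) inside the aperture integral.
import numpy as np
from scipy.signal.windows import taylor  # Taylor n-bar aperture taper

def pattern_db(aperture, x, k, theta, R=np.inf):
    """Peak-normalized pattern (dB) of a line source sampled at positions x,
    optionally including the quadratic phase error of a measurement at range R."""
    quad = np.exp(1j * k * x**2 / (2.0 * R)) if np.isfinite(R) else 1.0
    E = (aperture * quad) @ np.exp(1j * k * np.outer(x, np.sin(theta)))
    mag = np.abs(E)
    return 20.0 * np.log10(mag / mag.max())

def first_sidelobe_db(p):
    """First sidelobe level: first local maximum after the first null,
    assuming the main-beam peak sits at index 0 of a theta >= 0 cut."""
    i = 1
    while i < len(p) - 1 and not (p[i] <= p[i - 1] and p[i] <= p[i + 1]):
        i += 1                                   # walk down to the first null
    while i < len(p) - 1 and not (p[i] >= p[i - 1] and p[i] >= p[i + 1]):
        i += 1                                   # walk up to the sidelobe peak
    return p[i]

lam = 1.0
D = 50.0 * lam                                   # aperture length: 50 wavelengths
k = 2.0 * np.pi / lam
x = np.linspace(-D / 2, D / 2, 1001)
g = taylor(x.size, nbar=8, sll=40)               # 40 dB Taylor n-bar distribution
theta = np.linspace(0.0, np.radians(10.0), 2001)

far = pattern_db(g, x, k, theta)                 # far-field reference pattern
r_ff = 2.0 * D**2 / lam                          # conventional far-field distance

for mult in (1, 2, 4, 8):
    near = pattern_db(g, x, k, theta, R=mult * r_ff)
    err = first_sidelobe_db(near) - first_sidelobe_db(far)
    print(f"R = {mult} x 2D^2/lambda: first-sidelobe error = {err:+.2f} dB")
```

Sweeping R and the design sidelobe level in this way produces curves of sidelobe error versus normalized measurement distance of the kind reported here.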