It is assumed that a signal consisting of a constant plus at most m sinusoids and corrupted by noise is observed at equally spaced time intervals for a finite length of time. An optimum least-mean-square-error estimate of the spectral components of the signal (i.e., the mean value and the amplitude, phase, and frequency of each sinusoidal component) is derived, based on a large signal-to-noise-ratio approximation. The estimates for the sampled values of the signal (and therefore the estimate for the mean-square error) are obtained explicitly in terms of the observed sampled data. Similarly, the estimate for the mean value of the signal is obtained explicitly. The estimates for the remaining spectral components of the signal are obtained implicitly, requiring the solution of an mth-degree algebraic equation.
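
For concreteness, the observation model implied by this description can be written as

$$x_k = a_0 + \sum_{i=1}^{m} a_i \cos(\omega_i k T + \phi_i) + n_k, \qquad k = 0, 1, \dots, N-1,$$

where $a_0$ is the constant (mean) term, $a_i$, $\phi_i$, and $\omega_i$ are the amplitude, phase, and frequency of the ith sinusoid, $T$ is the sampling interval, and $n_k$ is the noise; these symbols are not taken from the paper and are used here only to fix notation. The sketch below is likewise only illustrative: it restricts the model to a single sinusoid and replaces the implicit polynomial-rooting step with a brute-force frequency search, solving a linear least-squares problem for the mean, amplitude, and phase at each trial frequency. The function and variable names are hypothetical.

```python
import numpy as np

def fit_constant_plus_sinusoid(t, x, freq_grid):
    """Least-squares fit of x(t) ~ a0 + a*cos(2*pi*f*t) + b*sin(2*pi*f*t).

    For each trial frequency the model is linear in (a0, a, b); the frequency
    minimizing the residual sum of squares is kept.  This grid search stands in
    for the implicit (algebraic-equation) frequency solution described above.
    """
    best = None
    for f in freq_grid:
        # Design matrix: constant, cosine, and sine columns at this frequency.
        A = np.column_stack([np.ones_like(t),
                             np.cos(2 * np.pi * f * t),
                             np.sin(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(A, x, rcond=None)
        rss = np.sum((x - A @ coef) ** 2)
        if best is None or rss < best[0]:
            best = (rss, f, coef)
    rss, f, (a0, a, b) = best
    amplitude = np.hypot(a, b)
    phase = np.arctan2(-b, a)   # so that x ~ a0 + amplitude*cos(2*pi*f*t + phase)
    return a0, amplitude, phase, f, rss

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(200) * 0.01                      # equally spaced samples
    x = (1.5 + 2.0 * np.cos(2 * np.pi * 7.0 * t + 0.4)
         + 0.1 * rng.standard_normal(t.size))      # constant + sinusoid + noise
    print(fit_constant_plus_sinusoid(t, x, np.linspace(1.0, 20.0, 2000)))
```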