In recent years there has been increasing interest in autoregressive spectrum estimation. This procedure fits a finite autoregression to the time series data and calculates the spectrum from the estimated autoregression coefficients and the one-step prediction error variance. For multivariate time series, the estimated autoregressive matrices and one-step prediction covariance matrix produce estimates of the spectra, coherences, phases, and group delays. The use of Akaike's information criterion (AIC) to identify the order of the autoregression makes the procedure objective. Experience gained from analyzing large amounts of data from the biological and physical sciences has indicated that AIC works very well for model identification when compared with more subjective procedures such as the examination of partial F-statistics.
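To make the procedure concrete, the following minimal sketch shows a univariate version: AR models are fitted by the Yule-Walker (Levinson-Durbin) recursion, the order is chosen by minimizing one common form of AIC, and the spectrum is evaluated from the fitted coefficients and the one-step prediction error variance. The function names, the maximum candidate order, and the particular AIC formula (N ln σ²_p + 2p) are illustrative assumptions rather than details taken from this abstract; the multivariate case follows the same pattern with coefficient matrices and a prediction error covariance matrix.

```python
import numpy as np

def yule_walker(x, order):
    """Fit an AR(order) model via the Yule-Walker equations (Levinson-Durbin).

    Returns (a, sigma2): coefficients a_1..a_p of
    x_t = a_1 x_{t-1} + ... + a_p x_{t-p} + e_t, and the one-step
    prediction error variance sigma2 = Var(e_t).
    """
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Biased sample autocovariances r[0..order]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    a = np.zeros(order)
    sigma2 = r[0]
    for k in range(order):
        # Reflection (partial autocorrelation) coefficient at lag k+1.
        kappa = (r[k + 1] - np.dot(a[:k], r[1:k + 1][::-1])) / sigma2
        a_prev = a[:k].copy()
        a[k] = kappa
        a[:k] = a_prev - kappa * a_prev[::-1]
        sigma2 *= 1.0 - kappa ** 2
    return a, sigma2

def select_order_aic(x, max_order=30):
    """Pick the AR order minimizing AIC(p) = N ln(sigma2_p) + 2p."""
    n = len(x)
    aics = []
    for p in range(1, max_order + 1):
        _, sigma2 = yule_walker(x, p)
        aics.append(n * np.log(sigma2) + 2 * p)
    return int(np.argmin(aics)) + 1

def ar_spectrum(a, sigma2, freqs, dt=1.0):
    """Two-sided AR spectral density:
    S(f) = sigma2 * dt / |1 - sum_k a_k exp(-2*pi*i*f*k*dt)|^2."""
    k = np.arange(1, len(a) + 1)
    denom = 1.0 - np.exp(-2j * np.pi * np.outer(np.asarray(freqs) * dt, k)) @ a
    return sigma2 * dt / np.abs(denom) ** 2
```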

This experience has also indicated that using both autoregressive and classical spectrum estimation and superimposing the plots gives a much better sense of the shape of the true spectrum being estimated. The results of some of these analyses are presented.
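As a hedged sketch of the kind of overlay described here (not the authors' code), the snippet below superimposes the AR spectrum estimate on a classical smoothed-periodogram (Welch) estimate for a synthetic series; it assumes the yule_walker, select_order_aic, and ar_spectrum helpers from the sketch above, and the synthetic AR(4) process is an arbitrary stand-in for real data.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import signal

# Assumes yule_walker, select_order_aic, ar_spectrum from the sketch above.
rng = np.random.default_rng(0)

# Synthetic data: an AR(4) process built from poles inside the unit circle,
# standing in for a real biological or physical time series.
poles = np.array([0.95 * np.exp(2j * np.pi * 0.12), 0.90 * np.exp(2j * np.pi * 0.30)])
ar_poly = np.real(np.poly(np.concatenate([poles, poles.conj()])))  # [1, -a1, ..., -a4]
x = signal.lfilter([1.0], ar_poly, rng.standard_normal(4096))

fs = 1.0
p = select_order_aic(x, max_order=30)              # order chosen by AIC
a, sigma2 = yule_walker(x, p)                      # AR fit at that order
freqs = np.linspace(0.0, fs / 2, 512)
S_ar = ar_spectrum(a, sigma2, freqs, dt=1.0 / fs)

# Classical estimate: Welch's averaged, smoothed periodogram (one-sided).
f_w, S_welch = signal.welch(x, fs=fs, nperseg=256)

plt.semilogy(f_w, S_welch, label="Welch periodogram")
plt.semilogy(freqs, 2.0 * S_ar, label=f"AR({p}) spectrum, order by AIC")  # x2: one-sided
plt.xlabel("frequency (cycles per sample)")
plt.ylabel("power spectral density")
plt.legend()
plt.show()
```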