Abstract:
For $n > 1$ let $X = (X_1,\ldots,X_n)'$ have mean vector $\theta\mathbf{1}$ and covariance matrix $\sigma^2\Sigma$, where $\mathbf{1} = (1,\ldots,1)'$, $\Sigma$ is a known positive definite matrix, and $\sigma^2 > 0$ is either known or unknown. This model has been found useful when the observations $X_1,\ldots,X_n$ from a population with mean $\theta$ are not independent. We show how the variance of $\bar{X}$, the least-squares estimator of $\theta$, depends on the covariance structure $\Sigma$. More specifically, we give expressions for $\mathrm{Var}(\bar{X})$, obtain its lower and upper bounds (which involve only the smallest and the largest eigenvalues of $\Sigma$), and show how the dependence among $X_1,\ldots,X_n$ plays a role in $\mathrm{Var}(\bar{X})$. Examples of applications are given for M-matrices, for exchangeable random variables, for a class of covariance matrices with a block-correlation structure, and for twin data.
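The following is a minimal numerical sketch, not taken from the paper, of the quantities the abstract refers to. It assumes the standard facts that the least-squares estimator of $\theta$ in this model is the sample mean $\bar{X} = n^{-1}\mathbf{1}'X$, that $\mathrm{Var}(\bar{X}) = \sigma^2\,\mathbf{1}'\Sigma\mathbf{1}/n^2$, and that the Rayleigh-quotient inequality gives $\sigma^2\lambda_{\min}/n \le \mathrm{Var}(\bar{X}) \le \sigma^2\lambda_{\max}/n$; the paper's own expressions and bounds may be stated differently. The values of $n$, $\sigma^2$, and $\rho$, and the choice of an exchangeable $\Sigma$, are hypothetical and only for illustration.

```python
import numpy as np

# Hypothetical parameter values for illustration only
n, sigma2, rho = 5, 2.0, 0.3
one = np.ones(n)

# Exchangeable (compound-symmetry) correlation structure, one of the
# examples mentioned in the abstract: Sigma = (1 - rho) I + rho J
Sigma = (1 - rho) * np.eye(n) + rho * np.ones((n, n))

# Var(Xbar) = sigma^2 * 1' Sigma 1 / n^2  (standard result, assumed here)
var_xbar = sigma2 * one @ Sigma @ one / n**2

# Eigenvalue bounds: sigma^2 * lambda_min / n <= Var(Xbar) <= sigma^2 * lambda_max / n
lmin, lmax = np.linalg.eigvalsh(Sigma)[[0, -1]]

print(f"Var(Xbar)   = {var_xbar:.4f}")
print(f"lower bound = {sigma2 * lmin / n:.4f}")
print(f"upper bound = {sigma2 * lmax / n:.4f}")
```

For the exchangeable structure, $\mathbf{1}$ is an eigenvector of $\Sigma$ corresponding to $\lambda_{\max} = 1 + (n-1)\rho$, so the computed $\mathrm{Var}(\bar{X})$ attains the upper bound exactly; with independent observations ($\rho = 0$) both bounds collapse to the familiar $\sigma^2/n$.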