Title of article :
Common Principal Components for Dependent Random Vectors
Author/Authors :
Neuenschwander, Beat E. and Flury, Bernard D.
Issue Information :
2000
Abstract :
Let the kp-variate random vector X be partitioned into k subvectors X_i of dimension p each, and let the covariance matrix Ψ of X be partitioned analogously into submatrices Ψ_ij. The common principal component (CPC) model for dependent random vectors assumes the existence of an orthogonal p × p matrix β such that β^t Ψ_ij β is diagonal for all (i, j). After a formal definition of the model, normal theory maximum likelihood estimators are obtained. The asymptotic theory for the estimated orthogonal matrix is derived by a new technique of choosing proper subsets of functionally independent parameters.
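To make the model structure concrete, here is a minimal sketch (not the authors' estimation procedure) that constructs a kp × kp covariance matrix Ψ satisfying the CPC model for dependent random vectors and verifies that a single orthogonal matrix β diagonalizes every submatrix Ψ_ij. All variable names and the particular construction of Λ are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
k, p = 3, 4  # k subvectors of dimension p each

# Common orthogonal matrix beta (p x p), obtained from a QR decomposition.
beta, _ = np.linalg.qr(rng.standard_normal((p, p)))

# Build Lambda: a kp x kp positive-definite matrix whose p x p blocks are
# all diagonal. One simple construction: for each of the p coordinates,
# draw a k x k positive-definite matrix and scatter its entries across
# the corresponding diagonal positions of the blocks.
Lambda = np.zeros((k * p, k * p))
for r in range(p):
    A = rng.standard_normal((k, k))
    C = A @ A.T + k * np.eye(k)        # k x k positive definite
    for i in range(k):
        for j in range(k):
            Lambda[i * p + r, j * p + r] = C[i, j]

# With B = I_k kron beta, the matrix Psi = B Lambda B^T satisfies
# beta^T Psi_ij beta = Lambda_ij, which is diagonal for every (i, j).
B = np.kron(np.eye(k), beta)
Psi = B @ Lambda @ B.T

# Check: every rotated submatrix is diagonal (up to rounding error).
for i in range(k):
    for j in range(k):
        Psi_ij = Psi[i * p:(i + 1) * p, j * p:(j + 1) * p]
        D = beta.T @ Psi_ij @ beta
        assert np.allclose(D, np.diag(np.diag(D)), atol=1e-10)
print("beta simultaneously diagonalizes all Psi_ij blocks")
```

In practice β is unknown and is estimated by normal theory maximum likelihood as described in the paper; the sketch only illustrates the patterned covariance structure that the model assumes.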
Keywords :
eigenvector, entropy, maximum likelihood estimation, multivariate normal distribution, patterned covariance matrices, eigenvalue, asymptotic distribution
Journal title :
Journal of Multivariate Analysis