Title of article :
Who's afraid of reduced-rank parameterizations of multivariate models? Theory and example
Author/Authors :
Gilbert, Scott and Zemčík, Petr
Issue Information :
Biannual, consecutive issue numbering, 2006
Abstract :
Reduced-rank restrictions can add useful parsimony to coefficient matrices of multivariate models, but their use is limited by the daunting complexity of the methods and their theory. The present work takes the easy road, focusing on unifying themes and simplified methods. For Gaussian and non-Gaussian (GLM, GAM, mixed normal, etc.) multivariate models, it gives a unified, explicit theory for the general asymptotic (normal) distribution of maximum likelihood estimators (MLE). The MLE can be complex and computationally hard to obtain, but we show a strong asymptotic equivalence between the MLE and a relatively simple minimum (Mahalanobis) distance estimator. The latter method yields particularly simple tests of rank, and we describe its asymptotic behavior in detail. We also examine the method's performance in simulation and via analytical and empirical examples.
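Illustrative sketch: to make the reduced-rank idea concrete, the Python snippet below fits a multivariate regression Y = XC + E and imposes a rank-r restriction on the coefficient matrix C by a weighted (Mahalanobis-type) projection of the unrestricted OLS fit, using the inverse residual covariance as the weight. This is the classical Gaussian reduced-rank regression special case, not the paper's exact estimator; the function name `reduced_rank_fit` and the simulated data are hypothetical and for illustration only.

```python
# Sketch (assumed setup, not the authors' exact method): rank-r restricted
# coefficient estimate for the multivariate regression Y = X C + E.
import numpy as np

def reduced_rank_fit(Y, X, r):
    """Rank-r coefficient estimate minimizing tr[(Y - XC) W (Y - XC)'],
    with W taken as the inverse OLS residual covariance (a common
    Mahalanobis-style weighting; other weight choices are possible)."""
    n, m = Y.shape
    # Unrestricted (full-rank) OLS coefficients, shape (p, m)
    C_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ C_ols
    Sigma = resid.T @ resid / n                        # residual covariance, m x m
    W_half = np.linalg.cholesky(np.linalg.inv(Sigma))  # square root of the weight W

    # Right-multiplying by W_half turns the weighted problem into an ordinary
    # Frobenius-norm reduced-rank problem; truncate via the top-r right
    # singular vectors of the fitted values, then transform back.
    C_tilde = C_ols @ W_half
    fitted = X @ C_tilde
    _, _, Vt = np.linalg.svd(fitted, full_matrices=False)
    V_r = Vt[:r].T                                     # top-r right singular vectors
    C_r_tilde = C_tilde @ V_r @ V_r.T                  # rank-r projection
    return C_r_tilde @ np.linalg.inv(W_half)           # back to the original scale

# Hypothetical usage: simulate data whose true coefficient matrix has rank 2
rng = np.random.default_rng(0)
n, p, m, true_rank = 200, 5, 4, 2
C_true = rng.normal(size=(p, true_rank)) @ rng.normal(size=(true_rank, m))
X = rng.normal(size=(n, p))
Y = X @ C_true + rng.normal(scale=0.5, size=(n, m))
C_hat = reduced_rank_fit(Y, X, r=2)
print(np.linalg.matrix_rank(C_hat), np.linalg.norm(C_hat - C_true))
```

The paper's minimum distance estimator weights the discrepancy between the restricted and unrestricted coefficient vectors more generally; the sketch above shows only the familiar closed-form case to convey why a rank restriction adds parsimony.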
Keywords :
Regression, Coefficient matrix, Reduced-rank, Estimation, Asymptotic theory, Test, Multivariate model
Journal title :
Journal of Multivariate Analysis