Title :
Lower bounds for the divergence of orientational estimators
Author :
Valkenburg, Robert J. ; Kakarala, Ramakrishna
Author_Institution :
Ind. Res. Ltd., Auckland, New Zealand
Date :
9/1/2001
Abstract :
This paper is concerned with the properties of estimators in O(n,p), the set of n×p matrices with orthonormal columns. It is shown that it is natural to introduce the notion of a parallel estimator, one whose expected value lies in the normal space (the orthogonal complement of the tangent space) of O(n,p) at the true value. An appropriate measure of variance, referred to as the divergence, is introduced for a parallel estimator, and a Cramer-Rao (CR) type bound is then established for the divergence. The well-known Fisher-von Mises matrix distribution, which is often used to model random behavior on O(n,p), depends on the parameters Θ ∈ O(n,p) and H, a p×p symmetric matrix. The bound for this distribution is calculated for the case n=p=3, and the divergence of the maximum-likelihood estimator (MLE) of Θ is estimated by simulation. The bound is shown to be tight over a wide range of H.
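The simulation described in the abstract can be sketched in miniature. For the matrix Fisher-von Mises family, the standard orientation estimate from a set of samples is the SVD-based (orthogonal Procrustes) projection of the sample mean back onto the manifold; the sketch below assumes that form of the MLE, restricts to the rotation group for n=p=3, and substitutes a simple Gaussian perturbation with concentration `kappa` for exact matrix-Fisher sampling (which is nontrivial). The names `project_to_so3`, `kappa`, and the squared-Frobenius error used as a divergence-style measure are illustrative choices, not the paper's definitions.

```python
import numpy as np

def project_to_so3(M):
    """Nearest rotation to M: SVD projection with a determinant
    correction so the result lands in SO(3), not just O(3)."""
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])
    return U @ D @ Vt

rng = np.random.default_rng(0)
Theta = project_to_so3(rng.normal(size=(3, 3)))  # arbitrary "true" rotation

# Crude surrogate for matrix-Fisher sampling: perturb the true rotation
# with Gaussian noise of scale 1/sqrt(kappa), then re-project.
kappa = 50.0
samples = [project_to_so3(Theta + rng.normal(scale=1 / np.sqrt(kappa),
                                             size=(3, 3)))
           for _ in range(200)]

# MLE-style estimate: project the sample mean back onto SO(3).
Theta_hat = project_to_so3(np.mean(samples, axis=0))

# Divergence-style error: squared Frobenius distance to the truth.
err = np.linalg.norm(Theta_hat - Theta) ** 2
```

Repeating the estimate over many independent sample sets and averaging `err` gives a Monte Carlo estimate of the estimator's spread, which is the quantity the paper compares against its CR-type lower bound.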
Keywords :
information theory; matrix algebra; maximum likelihood estimation; Cramer-Rao type bound; Fisher-von Mises matrix distribution; divergence; lower bounds; maximum-likelihood estimator; normal space; orientational estimators; orthogonal matrices; parallel estimator; random behavior; variance; Chromium; Dispersion; Laboratories; Level measurement; Linear matrix inequalities; Maximum likelihood estimation; Random variables; Statistical distributions; Symmetric matrices;
Journal_Title :
IEEE Transactions on Information Theory