DocumentCode :
1458416
Title :
A minimum discrepancy estimator in parameter estimation
Author :
Chang, Shyang; Chang, Yen-Ching; Chang, Chen-Yu
Author_Institution :
Dept. of Electr. Eng., Nat. Tsing Hua Univ., Hsinchu, Taiwan
Volume :
44
Issue :
7
fYear :
1998
fDate :
11/1/1998
Firstpage :
2930
Lastpage :
2942
Abstract :
In statistical estimation theory, a satisfactory estimator should embody a large portion of the available information, whether known a priori or provided by the data; the loss of information is then minimal when this estimator is employed. In previous work, an estimator criterion based on the discrepancy between the estimator's error covariance and its information lower bound was proposed. Conceptually, this criterion measures the loss of information carried by a parameter estimator under the Bayesian approach. A minimum discrepancy estimator (MDE) was derived under a linearity assumption. It was pointed out, however, that the linear version could guarantee neither minimal information loss nor certain desirable asymptotic properties. Therefore, in this paper, the existence and uniqueness conditions of the general MDE are studied under certain regularity conditions. The MDE can be obtained by solving a Fredholm equation of the second kind. Furthermore, it is shown to be consistent and asymptotically efficient. As a result, the MDE is guaranteed to have the minimum loss of information for finite samples and no loss of information as the sample size tends to infinity. Examples indicate that if the prior information is vague, the MDE is superior to the minimum variance estimator (MVE) in terms of information loss, and if the prior distribution is suitably chosen, the MDE is superior to the maximum-likelihood estimator (MLE) in terms of deficiency.
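The abstract notes that the general MDE is obtained by solving a Fredholm equation of the second kind. The following is a minimal illustrative sketch, not the paper's method: it solves a generic equation f(x) = g(x) + lam * int_a^b K(x,t) f(t) dt by standard Nystrom (quadrature) discretization. The kernel K, forcing term g, and parameter lam below are hypothetical placeholders chosen only to make the sketch runnable; in the paper these would be determined by the prior and the likelihood.

```python
# Illustrative sketch only: a standard Nystrom solver for a Fredholm equation
# of the second kind.  The kernel and forcing term are hypothetical, not the
# paper's; they stand in for the quantities defined by the MDE derivation.
import numpy as np

def solve_fredholm_2nd_kind(K, g, a, b, lam=1.0, n=200):
    """Solve f(x) = g(x) + lam * int_a^b K(x, t) f(t) dt on [a, b]
    via Nystrom discretization with trapezoidal quadrature weights."""
    x = np.linspace(a, b, n)
    w = np.full(n, (b - a) / (n - 1))
    w[0] *= 0.5
    w[-1] *= 0.5                       # trapezoidal rule weights
    Kmat = K(x[:, None], x[None, :])   # kernel evaluated on the grid
    A = np.eye(n) - lam * Kmat * w     # linear system (I - lam*K*W) f = g
    f = np.linalg.solve(A, g(x))
    return x, f

if __name__ == "__main__":
    # Hypothetical smooth kernel and forcing term, for illustration only.
    K = lambda x, t: np.exp(-np.abs(x - t))
    g = lambda x: np.sin(np.pi * x)
    x, f = solve_fredholm_2nd_kind(K, g, 0.0, 1.0, lam=0.5)
    print(f[:5])
```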
Keywords :
Bayes methods; Fredholm integral equations; covariance analysis; information theory; parameter estimation; Bayesian approach; Fredholm equation; asymptotic properties; error covariance; estimator criterion; finite samples; information loss; information lower bound; linearity assumption; minimum discrepancy estimator; parameter estimation; prior information; regularity conditions; sample size; statistical estimation; Bayesian methods; Cramer-Rao bounds; Equations; Estimation theory; H infinity control; Linearity; Loss measurement; Maximum likelihood estimation; Model driven engineering; Parameter estimation;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
ieee
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.737523
Filename :
737523