DocumentCode :
1272313
Title :
Goodness-of-fit tests based on Kullback-Leibler discrimination information
Author :
Song, Kai-Sheng
Author_Institution :
Dept. of Stat., Florida State Univ., Tallahassee, FL, USA
Volume :
48
Issue :
5
fYear :
2002
fDate :
5/1/2002 12:00:00 AM
Firstpage :
1103
Lastpage :
1117
Abstract :
We present a general methodology for developing asymptotically distribution-free goodness-of-fit tests based on the Kullback-Leibler discrimination information. The tests are shown to be omnibus within an extremely large class of nonparametric global alternatives and to have good local power. The proposed test procedure is a nonparametric extension of the classical Neyman-Pearson log-likelihood ratio test and is based on mth-order spacings between order statistics cross-validated by the observed log likelihood. The developed method also generalizes Cox's procedure of testing separate families and covers virtually all parametric families of distributions encountered in statistics. It can also be viewed as a procedure based on sum-log functionals of nonparametric density-quantile estimators cross-validated by the log likelihood. With its good power properties, the method provides an extremely simple and potentially much better alternative to the classical empirical distribution function (EDF)-based test procedures. The important problem of selecting the order of spacings m in practice is also considered, and a method based on maximizing the sample entropy constrained by the observed log likelihood is proposed. This data-driven method of choosing m is demonstrated by Monte Carlo simulations to be more powerful than deterministic choices of m and thus provides a practically useful tool for implementing our test procedure.
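The abstract describes a test statistic built from an m-spacing (Vasicek-type) entropy estimate cross-validated against the fitted parametric log likelihood: the Kullback-Leibler discrimination KL(g, f) = -H(g) - E_g[log f] is estimated nonparametrically, and large values reject the hypothesized family. The sketch below is an illustrative reconstruction, not the paper's exact statistic: it tests normality, uses the standard Vasicek boundary convention for the spacings, and the function names and the fixed choice of m are my own (the paper selects m by a constrained entropy-maximization rule).

```python
import numpy as np

def vasicek_entropy(x, m):
    """m-spacing entropy estimate:
    (1/n) * sum_i log[(n/(2m)) * (X_(i+m) - X_(i-m))],
    with X_(i) = X_(1) for i < 1 and X_(i) = X_(n) for i > n."""
    n = len(x)
    xs = np.sort(x)
    lo = np.clip(np.arange(n) - m, 0, n - 1)   # index of X_(i-m), clipped
    hi = np.clip(np.arange(n) + m, 0, n - 1)   # index of X_(i+m), clipped
    spacings = xs[hi] - xs[lo]
    return np.mean(np.log(n / (2.0 * m) * spacings))

def kl_gof_statistic(x, m):
    """KL-type goodness-of-fit statistic for a normal null:
    -H_mn - (1/n) * sum_i log f(X_i; mu_hat, sigma_hat).
    Values near zero support the null; large values reject."""
    mu, sigma = np.mean(x), np.std(x)          # Gaussian MLEs
    mean_loglik = np.mean(-0.5 * np.log(2 * np.pi * sigma**2)
                          - (x - mu)**2 / (2 * sigma**2))
    return -vasicek_entropy(x, m) - mean_loglik
```

For example, on a large N(0, 1) sample the statistic is close to zero, while on exponential data fitted by a normal it estimates the positive discrimination information between the two families.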
Keywords :
Monte Carlo methods; maximum entropy methods; statistical analysis; Cox's procedure; EDF-based test procedures; Kullback-Leibler discrimination information; classical Neyman-Pearson log-likelihood ratio test; classical empirical distribution function; goodness-of-fit tests; local power; log likelihood; mth-order spacings; nonparametric density-quantile estimators; nonparametric global alternatives; order statistics; parametric families; sample entropy; sum-log functionals; Distribution functions; Entropy; Information theory; Neural networks; Parametric statistics; Power engineering and energy; Statistical analysis; Statistical distributions; Testing;
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
ieee
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.995548
Filename :
995548