Title of article :
Variable selection in qualitative models via an entropic explanatory power
Author/Authors :
Dupuis, Jérôme A.; Robert, Christian P.
Issue Information :
Journal serial issue, year 2003
Pages :
From page :
77
To page :
Abstract :
The variable selection method proposed in the paper is based on the evaluation of the Kullback–Leibler distance between the full (or encompassing) model and its submodels. The Bayesian implementation of the method does not require separate prior modeling of the submodels, since the corresponding submodel parameters are defined as the Kullback–Leibler projections of the full-model parameters. The result of the selection procedure is the submodel with the smallest number of covariates that lies within an acceptable distance of the full model. We introduce the notion of the explanatory power of a model and scale the maximal acceptable distance in terms of the explanatory power of the full model. Moreover, an additivity property between embedded submodels shows that our selection procedure is equivalent to selecting the submodel with the smallest number of covariates that has sufficient explanatory power. We illustrate the performance of this method on a breast cancer dataset.
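As a reading aid, a minimal formal sketch of the quantities described in the abstract follows; the notation (full-model density $f_\theta$, submodel density $g_\beta$, covariate subset $\gamma$, threshold $d_{\max}$) is illustrative and not taken from the paper itself:
\[
d\bigl(f_\theta, g_\beta\bigr)
 = \int f(y \mid x, \theta)\,\log\frac{f(y \mid x, \theta)}{g(y \mid x_\gamma, \beta)}\,\mathrm{d}y,
\qquad
\beta_\gamma^{\perp} = \arg\min_{\beta}\, d\bigl(f_\theta, g_\beta\bigr),
\]
\[
\hat{\gamma} = \arg\min_{\gamma}\,\lvert\gamma\rvert
\quad\text{subject to}\quad
d\bigl(f_\theta,\, g_{\beta_\gamma^{\perp}}\bigr) \le d_{\max},
\]
where, following the abstract, $d_{\max}$ is calibrated as a fraction of the explanatory power of the full model (one common convention uses the distance to the intercept-only submodel as the reference scale, but the paper should be consulted for the exact definition).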
Keywords :
Correlation, Gibbs sampling, Historical data, Poisson regression, Prior distribution, Random effects
Journal title :
Journal of Statistical Planning and Inference
Serial Year :
2003
Record number :
73275