Title :
A simulation study of the model evaluation criterion MMRE
Author :
Foss, Tron ; Stensrud, Erik ; Kitchenham, Barbara ; Myrtveit, Ingunn
Author_Institution :
Norwegian Sch. of Manage., Sandvika, Norway
Abstract :
The mean magnitude of relative error (MMRE) is probably the most widely used evaluation criterion for assessing the performance of competing software prediction models. One purpose of MMRE is to assist us in selecting the best model. In this paper, we present a simulation study demonstrating that MMRE does not always select the best model. Our findings cast some doubt on the conclusions of any study of competing software prediction models that uses MMRE as a basis of model comparison. We therefore recommend against using MMRE to evaluate and compare prediction models. At present, we have no universal replacement for MMRE. Meanwhile, we recommend combining theoretical justification of the proposed models with the other metrics proposed in this paper.
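For reference, MMRE is conventionally defined as the mean of |y_i - ŷ_i| / y_i over n observations, where y_i is the actual value and ŷ_i the predicted value. The following Python sketch illustrates the computation; the function name `mmre` and the sample data are illustrative, not taken from the paper.

```python
# Illustrative sketch of the MMRE criterion: MMRE = (1/n) * sum(|y_i - yhat_i| / y_i).
# Names and sample data are hypothetical, not from the paper.

def mmre(actuals, predictions):
    """Mean magnitude of relative error over paired observations."""
    if len(actuals) != len(predictions):
        raise ValueError("actuals and predictions must have equal length")
    relative_errors = [abs(y - yhat) / y for y, yhat in zip(actuals, predictions)]
    return sum(relative_errors) / len(relative_errors)

# Example: two competing models' predictions for the same actual project efforts.
actual = [100.0, 250.0, 400.0]
model_a = [110.0, 240.0, 380.0]   # moderately close on every project
model_b = [100.0, 250.0, 800.0]   # exact twice, far off once

print(mmre(actual, model_a))  # ~0.063: small errors spread evenly
print(mmre(actual, model_b))  # ~0.333: dominated by the single large error
```

Note that MMRE averages relative errors, so a model that is exact on most projects but badly wrong on one can still score worse than a uniformly mediocre model; the paper's simulation study examines when such rankings fail to identify the best model.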
Keywords :
digital simulation; software cost estimation; software metrics; mean magnitude of relative error; model evaluation criterion MMRE; performance assessment; simulation study; software engineering; software prediction models; Accuracy; Analysis of variance; Classification tree analysis; Computational modeling; Costs; Predictive models; Regression analysis; Regression tree analysis; Software engineering; Software performance;
Journal_Title :
Software Engineering, IEEE Transactions on
DOI :
10.1109/TSE.2003.1245300