DocumentCode :
846591
Title :
SPEC as a performance evaluation measure
Author :
Giladi, Ran ; Ahituv, N.
Author_Institution :
Ben-Gurion Univ. of the Negev, Beer-Sheva, Israel
Volume :
28
Issue :
8
fYear :
1995
fDate :
8/1/1995
Firstpage :
33
Lastpage :
42
Abstract :
Potential computer system users or buyers usually employ a computer performance evaluation technique only if they believe its results provide valuable information. System Performance Evaluation Cooperative (SPEC) measures are perceived to provide such information and are therefore the ones most commonly used. SPEC measures are designed to evaluate the performance of engineering and scientific workstations, personal vector computers, and even minicomputers and superminicomputers. Along with the Transaction Processing Performance Council (TPC) measures for database I/O performance, they have become de facto industry standards, but do SPEC's evaluation outcomes actually provide added information value? In this article, we examine these measures by considering their structure, advantages, and disadvantages. We use two criteria in our examination: are the programs used in the SPEC suite properly blended to reflect a representative mix of different applications, and are they properly synthesized so that the aggregate measures correctly rank computers by performance? We conclude that many programs in the SPEC suites are superfluous; the benchmark size can be reduced by more than 50%. The way the measure is calculated may cause distortion. Substituting the harmonic mean for the geometric mean used by SPEC roughly preserves the measure while giving better consistency. SPEC measures reflect the performance of the CPU rather than the entire system, so they might be inaccurate in ranking an entire system. To remedy these problems, we propose a revised methodology for obtaining SPEC measures.
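The abstract's point about substituting the harmonic mean for the geometric mean can be illustrated with a minimal Python sketch; the per-benchmark SPECratio values below are hypothetical, chosen only to show the two aggregation formulas, not taken from the article.

```python
# Minimal sketch contrasting SPEC's geometric-mean aggregate with the
# harmonic mean proposed as a substitute. The ratios are hypothetical.
from math import prod

specratios = [12.4, 9.8, 15.1, 7.6, 11.3]  # hypothetical per-benchmark SPECratios

n = len(specratios)
geometric_mean = prod(specratios) ** (1 / n)        # aggregation SPEC uses
harmonic_mean = n / sum(1 / r for r in specratios)  # proposed alternative

print(f"geometric mean: {geometric_mean:.2f}")
print(f"harmonic mean:  {harmonic_mean:.2f}")
```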
Keywords :
performance evaluation; CPU performance; SPEC; System Performance Evaluation Cooperative measures; added information value; aggregate measures; applications mix; benchmark size; consistency; de facto industry standard; distortion; engineering workstations; geometric mean; harmonic mean; minicomputers; performance evaluation measure; personal vector computers; ranking; scientific workstations; superminicomputers; Application software; Computer performance; Councils; Design engineering; Distortion measurement; Measurement standards; Microcomputers; System performance; Transaction databases; Workstations;
fLanguage :
English
Journal_Title :
Computer
Publisher :
IEEE
ISSN :
0018-9162
Type :
jour
DOI :
10.1109/2.402073
Filename :
402073