• DocumentCode
    3043131
  • Title
    CONCEPTUAL: a network correctness and performance testing language
  • Author
    Pakin, Scott
  • Author_Institution
    Comput. & Comput. Sci. Div., Los Alamos Nat. Lab., NM, USA
  • fYear
    2004
  • fDate
    26-30 April 2004
  • Firstpage
    79
  • Abstract
    Summary form only given. We introduce a new, domain-specific specification language called CONCEPTUAL. CONCEPTUAL enables the expression of sophisticated communication benchmarks and network validation tests in comparatively few lines of code. Besides helping programmers save time writing and debugging code, CONCEPTUAL addresses the important, but largely unrecognized, problem of benchmark opacity. Benchmark opacity refers to the current impracticality of presenting performance measurements in a manner that promotes reproducibility and independent evaluation of the results. For example, stating that a performance graph was produced by a "bandwidth" test says nothing about whether that test measures the data rate during a round-trip transmission or the average data rate over a number of back-to-back unidirectional messages; whether the benchmark preregisters buffers, sends warm-up messages, and/or preposts asynchronous receives before starting the clock; how many runs were performed and whether these were aggregated by taking the mean, median, or maximum; or even whether a data unit such as "MB/s" indicates 10^6 or 2^20 bytes per second. Because CONCEPTUAL programs are terse, a benchmark's complete source code can be listed alongside performance results, making explicit all of the design decisions that went into the benchmark program. Because CONCEPTUAL's grammar is English-like, CONCEPTUAL programs can easily be understood by nonexperts. And because CONCEPTUAL is a high-level language, it can target a variety of messaging layers and networks, enabling fair and accurate performance comparisons.
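    The abstract's "MB/s" example can be made concrete. The sketch below (not from the paper; the timing values and function are hypothetical) shows how two of the hidden design decisions it lists, the definition of a megabyte (10^6 vs. 2^20 bytes) and the choice of aggregation (mean vs. median), each change the number a benchmark reports for the very same measurements:

    ```python
    # Illustrative sketch of benchmark opacity: the same raw timings yield
    # different "MB/s" figures depending on unit definition and aggregation.
    from statistics import mean, median

    def bandwidth(bytes_sent, seconds, mb=10**6):
        """Report bandwidth in 'MB/s', where one MB is `mb` bytes."""
        return bytes_sent / seconds / mb

    # Hypothetical per-run timings (seconds) for five 64 MiB transfers;
    # one run is an outlier, so mean and median disagree.
    runs = [0.071, 0.068, 0.070, 0.112, 0.069]
    size = 64 * 2**20  # 64 MiB per transfer, in bytes

    decimal_mean  = bandwidth(size, mean(runs))             # MB = 10^6 bytes, mean of runs
    binary_median = bandwidth(size, median(runs), mb=2**20) # MB = 2^20 bytes, median of runs

    # The two conventions report noticeably different "bandwidths" for
    # identical data, which is exactly the ambiguity the abstract describes.
    print(decimal_mean, binary_median)
    ```

    A reader comparing two published graphs cannot tell which convention each used unless the full benchmark source is listed alongside the results, which is what CONCEPTUAL's terseness is meant to make practical.
    
    
    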
  • Keywords
    message passing; performance evaluation; program debugging; specification languages; CONCEPTUAL performance testing language; back-to-back unidirectional messages; bandwidth test; benchmark preregisters buffer; debugging code; design decision; domain-specific specification language; network correctness; network validation tests; round-trip transmission; sophisticated communication benchmarks; warm-up messages; Bandwidth; Benchmark testing; Clocks; Debugging; Measurement; Performance evaluation; Programming profession; Reproducibility of results; Specification languages; Writing;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    18th International Parallel and Distributed Processing Symposium, 2004. Proceedings.
  • Print_ISBN
    0-7695-2132-0
  • Type
    conf
  • DOI
    10.1109/IPDPS.2004.1303014
  • Filename
    1303014