  • DocumentCode
    951064
  • Title
    Promoting Insight-Based Evaluation of Visualizations: From Contest to Benchmark Repository

  • Author
    Plaisant, Catherine; Fekete, Jean-Daniel; Grinstein, Georges

  • Author_Institution
    Univ. of Maryland, College Park
  • Volume
    14
  • Issue
    1
  • fYear
    2008
  • Firstpage
    120
  • Lastpage
    134
  • Abstract
    Information visualization (InfoVis) is now an accepted and growing field, but questions remain about the best uses for and the maturity of novel visualizations. Usability studies and controlled experiments are helpful, but generalization is difficult. We believe that the systematic development of benchmarks will facilitate the comparison of techniques and help identify their strengths under different conditions. We were involved in the organization and management of three InfoVis contests for the 2003, 2004, and 2005 IEEE InfoVis Symposia, which asked teams to report on insights gained while exploring data. We give a summary of the state of the art of evaluation in InfoVis, describe the three contests, summarize their results, discuss outcomes and lessons learned, and offer conjectures on the future of visualization contests. All materials produced by the contests are archived in the InfoVis benchmark repository.
  • Keywords
    data visualisation; InfoVis contest; benchmark repository; information visualization; Visualization; benchmark; competition; contest; information; measure; metrics; repository; Algorithms; Benchmarking; Computer Graphics; Databases, Factual; Evaluation Studies as Topic; Image Interpretation, Computer-Assisted; Software; Software Validation; User-Computer Interface;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Visualization and Computer Graphics
  • Publisher
    IEEE
  • ISSN
    1077-2626
  • Type
    jour
  • DOI
    10.1109/TVCG.2007.70412
  • Filename
    4359491