Toward mixed method evaluations of scientific visualizations and design process as an evaluation tool

Bret Jackson, Dane Coffey, Lauren Thorson, David Schroeder, Arin M. Ellingson, David J. Nuckley, Daniel F. Keefe

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

In this position paper, we discuss successes and limitations of current evaluation strategies for scientific visualizations and argue for embracing a mixed methods strategy of evaluation. The most novel contribution of the approach that we advocate is a new emphasis on employing design processes as practiced in related fields (e.g., graphic design, illustration, architecture) as a formalized mode of evaluation for data visualizations. To motivate this position, we describe a series of recent evaluations of scientific visualization interfaces and computer graphics strategies conducted within our research group. Complementing these more traditional evaluations, our visualization research group also regularly employs sketching, critique, and other design methods that have been formalized over years of practice in design fields. Our experience has convinced us that these activities are invaluable, often providing much more detailed evaluative feedback about our visualization systems than that obtained via more traditional user studies and the like. We believe that if design-based evaluation methodologies (e.g., ideation, sketching, critique) can be taught and embraced within the visualization community, then these may become one of the most effective future strategies for both formative and summative evaluations.

Original language: English (US)
Title of host publication: Proceedings of the 2012 Workshop on Beyond Time and Errors - Novel Evaluation Methods for Visualization, BELIV 2012
State: Published - 2012
Event: 2012 4th Workshop on Beyond Time and Errors - Novel Evaluation Methods for Visualization, BELIV 2012 - Seattle, WA, United States
Duration: Oct 14, 2012 - Oct 15, 2012

Publication series

Name: ACM International Conference Proceeding Series

Other

Other: 2012 4th Workshop on Beyond Time and Errors - Novel Evaluation Methods for Visualization, BELIV 2012
Country/Territory: United States
City: Seattle, WA
Period: 10/14/12 - 10/15/12

Keywords

  • Design
  • Evaluation
  • Visualization
