Artifact-based rendering: Harnessing natural and traditional visual media for more expressive and engaging 3D visualizations

Seth Johnson, Francesca Samsel, Gregory Abram, Daniel Olson, Andrew J. Solis, Bridger Herman, Phillip J. Wolfram, Christophe Lenglet, Daniel F. Keefe

Research output: Contribution to journal › Article › peer-review


Abstract

We introduce Artifact-Based Rendering (ABR), a framework of tools, algorithms, and processes that makes it possible to produce real, data-driven 3D scientific visualizations with a visual language derived entirely from colors, lines, textures, and forms created using traditional physical media or found in nature. A theory and process for ABR are presented to address three current needs: (i) designing better visualizations by making it possible for non-programmers to rapidly design and critique many alternative data-to-visual mappings; (ii) expanding the visual vocabulary used in scientific visualizations to depict increasingly complex multivariate data; (iii) bringing a more engaging, natural, and human-relatable handcrafted aesthetic to data visualization. New tools and algorithms to support ABR include front-end applets for constructing artifact-based colormaps, optimizing 3D scanned meshes for use in data visualization, and synthesizing textures from artifacts. These are complemented by an interactive rendering engine with custom algorithms and interfaces that demonstrate multiple new visual styles for depicting point, line, surface, and volume data. A within-the-research-team design study provides early evidence of the shift in visualization design processes that ABR is believed to enable when compared to traditional scientific visualization systems. Qualitative user feedback on applications to climate science and brain imaging supports the utility of ABR for scientific discovery and public communication.
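
To make the artifact-based colormap idea concrete, the sketch below shows one minimal way such a mapping could be built outside the paper's own tools: sample colors left to right across a scanned, hand-painted swatch and use the result as a discrete colormap for scalar data. This is an illustrative assumption, not the authors' applet or rendering engine; the file name painted_swatch.png and the helper colormap_from_artifact are hypothetical.

    # Illustrative sketch only: derive a colormap from a scanned artifact image
    # and apply it to a synthetic scalar field.
    import numpy as np
    from PIL import Image
    import matplotlib.pyplot as plt
    from matplotlib.colors import ListedColormap

    def colormap_from_artifact(image_path, n_colors=32):
        """Sample colors left-to-right across a scanned artifact image."""
        img = np.asarray(Image.open(image_path).convert("RGB"), dtype=float) / 255.0
        height, width, _ = img.shape
        # Average over a horizontal band at mid-height to smooth brush texture
        # while preserving the overall color ramp of the painted swatch.
        band = img[height // 3 : 2 * height // 3]
        cols = np.linspace(0, width - 1, n_colors).astype(int)
        samples = band[:, cols, :].mean(axis=0)
        return ListedColormap(samples, name="artifact")

    if __name__ == "__main__":
        cmap = colormap_from_artifact("painted_swatch.png")  # hypothetical scan
        # Apply the artifact-derived colormap to a synthetic scalar field.
        x, y = np.meshgrid(np.linspace(-3, 3, 256), np.linspace(-3, 3, 256))
        data = np.exp(-(x**2 + y**2) / 4) * np.sin(2 * x)
        plt.imshow(data, cmap=cmap)
        plt.colorbar(label="scalar value")
        plt.savefig("artifact_colormap_demo.png", dpi=150)

In the paper itself, this kind of mapping is authored interactively through front-end applets and an interactive rendering engine rather than scripted by hand; the sketch only conveys the underlying data-to-artifact color mapping.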

Original language: English (US)
Article number: 8794607
Pages (from-to): 492-502
Number of pages: 11
Journal: IEEE Transactions on Visualization and Computer Graphics
Volume: 26
Issue number: 1
DOIs:
State: Published - Jan 2020

Bibliographical note

Funding Information:
This research was supported in part by the National Science Foundation (IIS-1704604 & IIS-1704904). Brain microstructure applications were supported in part by the National Institutes of Health (P41 EB015894, P30 NS076408). MPAS-O simulations were conducted by Mathew E. Maltrud and Riley X. Brady as part of the Energy Exascale Earth System Model (E3SM) project, funded by the U.S. Department of Energy (DOE), Office of Science, Office of Biological and Environmental Research with analyses conducted by PJW, MEM, and RXB under ARPA-E Funding Opportunity No. DE-FOA-0001726, MARINER Award 17/CJ000/09/01, Pacific Northwest National Laboratory, prime recipient.

Keywords

  • Art and Visualization
  • Data Physicalization
  • Multivariate Visualization
  • Visualization Design

PubMed: MeSH publication types

  • Journal Article
  • Research Support, N.I.H., Extramural
  • Research Support, U.S. Gov't, Non-P.H.S.

