Rapid deployment of smartphone-based augmented reality tools for field and online education in structural biology

Tanner G. Hoog, Lauren M. Aufdembrink, Nathaniel J. Gaut, Rou Jia Sung, Katarzyna P. Adamala, Aaron E. Engelhart

Research output: Contribution to journal › Article › peer-review

Abstract

Structural biology education commonly employs molecular visualization software, such as PyMOL, RasMol, and VMD, to allow students to appreciate structure–function relationships in biomolecules. In on-ground, classroom-based education, these programs are commonly used on university-owned devices with software preinstalled. Remote education typically involves the use of student-owned devices, which complicates the use of such software, because (a) student devices have differing configurations (e.g., Windows vs macOS) and processing power, and (b) not all student devices are suitable for use with such software. Smartphones are near-ubiquitous devices, with smartphone ownership exceeding personal computer ownership, according to a recent survey. Here, we show the use of a smartphone-based augmented reality app, Augment, in a structural biology classroom exercise, which students installed independently without IT support. Post-lab attitudinal survey results indicate positive student experiences with this app. Based on our experiences, we suggest that smartphone-based molecular visualization software, such as that used in this exercise, is a powerful educational tool that is particularly well-suited for use in remote education.

Original language: English (US)
Pages (from-to): 448-451
Number of pages: 4
Journal: Biochemistry and Molecular Biology Education
Volume: 48
Issue number: 5
DOIs
State: Published - Sep 1 2020

Bibliographical note

Funding Information:
This work was supported by National Aeronautics and Space Administration Contract 80NSSC18K1139 under the Center for Origin of Life (to A.E.E. and K.P.A.). Experiments performed at the Itasca retreats were supported by the MCSB graduate program at the University of Minnesota. The 3D model of the Goldy Gopher statue used in Figure 1 ( https://sketchfab.com/3d-models/goldy-gopher-statue-b8f48abe6a534a3fb6f18a51a106d31d ) was generated by the Advanced Imaging Service for Objects and Spaces (AISOS) at the University of Minnesota and is used under a CC BY 4.0 license ( https://creativecommons.org/licenses/by/4.0/ ). We gratefully acknowledge the assistance of the University of Minnesota faculty, staff, and teaching assistants involved in the planning and execution of the Itasca retreats at which these experiments were performed. We thank members of the Engelhart and Adamala labs for helpful comments.

Publisher Copyright:
© 2020 International Union of Biochemistry and Molecular Biology

Keywords

  • computers in research and teaching
  • molecular visualization
  • web-based learning

PubMed: MeSH publication types

  • Journal Article
  • Research Support, Non-U.S. Gov't
