Sharing space in mixed and virtual reality environments using a low-cost depth sensor

Evan Suma Rosenberg, David M. Krum, Mark Bolas

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

12 Scopus citations

Abstract

We describe an approach for enabling people to share virtual space with a user who is fully immersed in a head-mounted display. By mounting a recently developed low-cost depth sensor on the user's head, we can generate depth maps in real time based on the user's gaze direction, allowing us to create mixed reality experiences by merging real people and objects into the virtual environment. This enables verbal and nonverbal communication between users who would normally be isolated from one another. We present the implementation of the technique, then discuss the advantages and limitations of using commercially available depth sensing technology in immersive virtual reality applications.
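
The abstract describes merging nearby real people and objects into the virtual scene using the head-mounted sensor's depth map. The sketch below illustrates one simple way such depth-based compositing can work: pixels with a valid depth reading inside a chosen range are copied from the sensor's color image over the rendered virtual frame. This is an assumed illustration, not the authors' implementation; the sensor API, function names, and thresholds are placeholders.

```python
# Minimal sketch of depth-threshold compositing, assuming registered color and
# depth frames from a head-mounted depth sensor (e.g., a Kinect-class device).
import numpy as np

NEAR_CLIP_MM = 400     # ignore readings closer than ~0.4 m (assumed sensor noise floor)
MERGE_RANGE_MM = 1500  # merge real objects within ~1.5 m into the scene (assumed)

def composite(virtual_rgb: np.ndarray,
              sensor_rgb: np.ndarray,
              sensor_depth_mm: np.ndarray) -> np.ndarray:
    """Overlay nearby real-world pixels onto the rendered virtual frame.

    virtual_rgb     : HxWx3 uint8 frame rendered for the HMD
    sensor_rgb      : HxWx3 uint8 color frame registered to the depth map
    sensor_depth_mm : HxW   uint16 depth map in millimeters (0 = no reading)
    """
    # Valid depth readings inside the merge range are treated as real people
    # or objects the immersed user should see inside the virtual environment.
    mask = (sensor_depth_mm > NEAR_CLIP_MM) & (sensor_depth_mm < MERGE_RANGE_MM)
    out = virtual_rgb.copy()
    out[mask] = sensor_rgb[mask]
    return out
```

In a full system the real imagery would more likely be injected per eye in the rendering pipeline with proper depth testing against the virtual geometry, rather than as a flat 2D overlay; the sketch only conveys the basic idea of selecting real-world pixels by depth.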

Original language: English (US)
Title of host publication: ISVRI 2011 - IEEE International Symposium on Virtual Reality Innovations 2011, Proceedings
Pages: 349-350
Number of pages: 2
DOIs
State: Published - Jun 2 2011
Externally published: Yes
Event: IEEE International Symposium on Virtual Reality Innovations 2011, ISVRI 2011 - Singapore, Singapore
Duration: Mar 19 2011 - Mar 20 2011

Other

Other: IEEE International Symposium on Virtual Reality Innovations 2011, ISVRI 2011
Country/Territory: Singapore
City: Singapore
Period: 3/19/11 - 3/20/11

Keywords

  • HMDs
  • depth-sensing cameras
  • mixed reality
