Real-time auditory-visual distance rendering for a virtual reaching task

Luca Mion, Federico Avanzini, Bruno Mantel, Benoit Bardy, Thomas A. Stoffregen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

This paper reports on a study of the perception and rendering of distance in multimodal virtual environments. A model for binaural sound synthesis is discussed, and its integration into a real-time system with motion tracking and visual rendering is presented. Results from a validation experiment show that the model effectively simulates relevant auditory cues for distance perception in dynamic conditions. The model is then used in a subsequent experiment on the perception of egocentric distance. The design and preliminary results of this experiment are discussed.
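The record gives no implementation details of the binaural model. Purely as a rough illustration of the kind of dynamic auditory distance cues the abstract refers to, the Python sketch below renders a mono source to a stereo pair using an inverse-distance (1/r) amplitude law and the Woodworth spherical-head approximation of the interaural time difference. This is a minimal sketch under those assumptions, not the authors' model; all names (render_block, woodworth_itd, etc.) are hypothetical.

import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m, average adult head (assumed value)

def distance_gain(r, r_ref=1.0):
    """Inverse-distance (1/r) amplitude attenuation, clamped near the head."""
    return r_ref / max(r, r_ref)

def woodworth_itd(azimuth_rad, a=HEAD_RADIUS, c=SPEED_OF_SOUND):
    """Woodworth spherical-head ITD approximation (valid for |azimuth| <= pi/2)."""
    return (a / c) * (azimuth_rad + np.sin(azimuth_rad))

def render_block(mono, source_pos, listener_pos, fs=44100):
    """Render one block of mono samples to (left, right) using a distance
    gain plus an ITD realized as an integer-sample delay on the far ear."""
    diff = np.asarray(source_pos, dtype=float) - np.asarray(listener_pos, dtype=float)
    r = np.linalg.norm(diff)
    azimuth = np.arctan2(diff[0], diff[1])      # convention: 0 rad = straight ahead
    g = distance_gain(r)
    lag = int(round(abs(woodworth_itd(azimuth)) * fs))
    delayed = np.concatenate([np.zeros(lag), mono])[:len(mono)]
    near, far = g * mono, g * delayed           # near ear leads, far ear lags
    # Positive azimuth = source on the right, so the right ear gets the leading signal.
    return (far, near) if azimuth > 0 else (near, far)

# Example: a 1 kHz tone from a source about 1.5 m away, 30 degrees to the right.
fs = 44100
t = np.arange(fs // 10) / fs
tone = 0.5 * np.sin(2 * np.pi * 1000 * t)
left, right = render_block(tone, source_pos=(0.75, 1.3, 0.0), listener_pos=(0.0, 0.0, 0.0), fs=fs)

In a real-time system such as the one described, source and listener positions would be updated every block from the motion tracker, and fractional-delay interpolation and air-absorption filtering would typically replace the integer-sample delay used here for brevity.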

Original language: English (US)
Title of host publication: Proceedings - VRST 2007, ACM Symposium on Virtual Reality Software and Technology
Pages: 179-182
Number of pages: 4
DOIs
State: Published - Dec 1 2007
Event: ACM Symposium on Virtual Reality Software and Technology, VRST 2007 - Newport Beach, CA, United States
Duration: Nov 5 2007 - Nov 7 2007

Publication series

Name: Proceedings of the ACM Symposium on Virtual Reality Software and Technology, VRST

Other

Other: ACM Symposium on Virtual Reality Software and Technology, VRST 2007
Country: United States
City: Newport Beach, CA
Period: 11/5/07 - 11/7/07

Keywords

  • 3-D sound
  • egocentric distance
  • multimodal interaction
  • virtual auditory space

