Modeling the human visuo-motor system to support remote-control operation

Jonathan Andersh, Bérénice Mettler

Research output: Contribution to journal › Article › peer-review


Abstract

The working hypothesis in this project is that gaze interactions play a central role in structuring the joint control and guidance strategy of the human operator performing spatial tasks. Perceptual guidance and control is the idea that the visual and motor systems form a unified perceptuo-motor system in which the necessary information is naturally extracted by the visual system. As a consequence, the response of this system is constrained by the visual and motor mechanisms, and these effects should manifest in the behavioral data. Modeling the perceptual processes of the human operator provides the foundation necessary for a systems-based approach to the design of control and display systems used by remotely operated vehicles. This paper investigates this hypothesis using flight tasks conducted with remotely controlled miniature rotorcraft in indoor settings, which provide rich environments for investigating the key processes supporting spatial interactions. This work also applies to spatial control tasks in a range of application domains, including tele-operation, gaming, and virtual reality. The human-in-the-loop system combines the dynamics of the vehicle, the environment, and human perception–action, with the response of the overall system emerging from the interplay of perception and action. The main questions to be answered in this work are: (i) what is the general control and guidance strategy of the human operator, and (ii) how is information about the vehicle and environment extracted visually by the operator? The general approach uses gaze as the primary sensory mechanism, decoding the gaze patterns of the pilot to provide information for estimation, control, and guidance. This work differs from existing research by taking what have largely been conceptual ideas on action–perception and structuring them for implementation in a real-world problem. The paper proposes a system model that captures the human pilot’s perception–action loop; the model delineates the main components of the pilot’s perceptuo-motor system, including estimation of the vehicle state and task elements based on operator gaze patterns, trajectory planning, and tracking control. The identified human visuo-motor model is then exploited to demonstrate how the perceptual and control functions of the system can be augmented to reduce the operator workload.
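
For illustration, the sketch below shows one way the perception–action loop described in the abstract (gaze decoding, gaze-informed state estimation, trajectory planning, and tracking control) could be organized in Python. All class and function names, filter choices, and gains here are hypothetical placeholders introduced for this example; they are not the authors' implementation.

```python
# Minimal sketch of the perception-action loop outlined in the abstract.
# Hypothetical components: gaze decoder -> state estimator -> planner -> tracker.
from dataclasses import dataclass

import numpy as np


@dataclass
class VehicleState:
    position: np.ndarray   # estimated 3D position of the rotorcraft
    velocity: np.ndarray   # estimated 3D velocity


def decode_gaze(gaze_sample: np.ndarray) -> np.ndarray:
    """Map a 2D gaze fixation to a 3D point of regard (placeholder geometry).

    A real decoder would use the eye tracker's calibration and the known
    scene geometry of the indoor flight environment.
    """
    x, y = gaze_sample
    return np.array([x, y, 0.0])


def estimate_state(prev: VehicleState, point_of_regard: np.ndarray,
                   dt: float, alpha: float = 0.3) -> VehicleState:
    """Exponential filter standing in for gaze-informed state estimation."""
    position = (1 - alpha) * prev.position + alpha * point_of_regard
    velocity = (position - prev.position) / dt
    return VehicleState(position, velocity)


def plan_trajectory(state: VehicleState, goal: np.ndarray) -> np.ndarray:
    """Return a short straight-line reference toward the goal (planning stub)."""
    direction = goal - state.position
    norm = np.linalg.norm(direction)
    return state.position + 0.1 * direction / norm if norm > 1e-6 else state.position


def tracking_control(state: VehicleState, reference: np.ndarray,
                     kp: float = 1.0, kd: float = 0.2) -> np.ndarray:
    """Proportional-derivative tracking command toward the reference point."""
    return kp * (reference - state.position) - kd * state.velocity


# One pass around the loop: gaze -> estimation -> planning -> control.
state = VehicleState(np.zeros(3), np.zeros(3))
point = decode_gaze(np.array([0.4, 0.2]))
state = estimate_state(state, point, dt=0.05)
command = tracking_control(state, plan_trajectory(state, goal=np.array([1.0, 1.0, 0.5])))
```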

Original language: English (US)
Article number: 2979
Journal: Sensors (Switzerland)
Volume: 18
Issue number: 9
DOIs
State: Published - Sep 6 2018

Bibliographical note

Funding Information:
This research work was made possible thanks to the financial support from the National Science Foundation (CMMI-1002298 and Career Grant CMMI-1254906) and the Office of Naval Research (Grant 11361538).

Publisher Copyright:
© 2018 by the authors. Licensee MDPI, Basel, Switzerland.

Keywords

  • Human–Machine interface
  • Teleoperation
  • Visuo-motor
