Cooperative vision-aided inertial navigation using overlapping views

Igor V. Melnyk, Joel A. Hesch, Stergios I. Roumeliotis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

28 Scopus citations

Abstract

In this paper, we study the problem of Cooperative Localization (CL) for two robots, each equipped with an Inertial Measurement Unit (IMU) and a camera. We present an algorithm that enables the robots to exploit common features, observed over a sliding-window time horizon, in order to improve the localization accuracy of both vehicles. In contrast to existing CL methods, which require robot-to-robot distance and/or bearing measurements to resolve the robots' relative position and orientation (pose), our approach recovers the relative pose through indirect information from the commonly observed features. Moreover, we analyze the system observability properties to determine how many degrees of freedom (d.o.f.) of the relative transformation can be computed under different measurement scenarios. Lastly, we present simulation results to evaluate the performance of the proposed method.
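The following is a minimal illustrative sketch, not the estimator described in the paper: it shows one simple way a relative pose could be recovered from commonly observed features, assuming each robot has already triangulated the shared features in its own frame and using a closed-form Kabsch/Umeyama point-set alignment in place of the paper's sliding-window, IMU-aided formulation. All function and variable names are hypothetical.

```python
# Illustrative sketch (assumed setup, not the paper's algorithm): recover the
# relative rotation R and translation t of robot 2's frame with respect to
# robot 1's frame from 3D features both robots have observed and triangulated.
import numpy as np

def relative_pose_from_common_features(pts_r1, pts_r2):
    """Estimate R, t such that pts_r1 ~= R @ pts_r2 + t.

    pts_r1, pts_r2: (N, 3) arrays of the same features expressed in
    robot 1's and robot 2's frames (N >= 3, non-degenerate geometry).
    """
    c1 = pts_r1.mean(axis=0)
    c2 = pts_r2.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (pts_r2 - c2).T @ (pts_r1 - c1)
    U, _, Vt = np.linalg.svd(H)
    # Enforce a proper rotation (det = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c1 - R @ c2
    return R, t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts_r1 = rng.uniform(-5, 5, size=(10, 3))   # features in robot 1's frame
    theta = 0.3
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0, 0.0, 1.0]])
    t_true = np.array([1.0, -2.0, 0.5])
    # The same features as robot 2 would express them in its own frame.
    pts_r2 = (R_true.T @ (pts_r1 - t_true).T).T
    R, t = relative_pose_from_common_features(pts_r1, pts_r2)
    print(np.allclose(R, R_true), np.allclose(t, t_true))
```

In the noiseless case above the closed-form alignment recovers the relative pose exactly; the paper's contribution concerns doing this jointly with IMU propagation over a sliding window and characterizing which degrees of freedom are observable under different measurement scenarios.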

Original language: English (US)
Title of host publication: 2012 IEEE International Conference on Robotics and Automation, ICRA 2012
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 936-943
Number of pages: 8
ISBN (Print): 9781467314039
DOIs
State: Published - 2012
Externally published: Yes
Event: 2012 IEEE International Conference on Robotics and Automation, ICRA 2012 - Saint Paul, MN, United States
Duration: May 14, 2012 – May 18, 2012

Publication series

Name: Proceedings - IEEE International Conference on Robotics and Automation
ISSN (Print): 1050-4729

Other

Other: 2012 IEEE International Conference on Robotics and Automation, ICRA 2012
Country/Territory: United States
City: Saint Paul, MN
Period: 5/14/12 – 5/18/12
