A deep learning approach for optical autonomous planetary relative terrain navigation

Tanner Campbell, Roberto Furfaro, Richard Linares, David Gaylor

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Autonomous relative terrain navigation is a problem at the forefront of many space missions involving close-proximity operations, and one that has no definitive solution. Many techniques address it using both passive and active sensors, but almost all require very sophisticated dynamical models. Convolutional Neural Networks (CNNs) trained on images rendered from a digital terrain map (DTM) offer a way to sidestep the issue of unknown or complex dynamics while still providing reliable autonomous navigation, by mapping an image directly to a position. The portability of trained CNNs allows offline training that can yield a mature network capable of being loaded onto a spacecraft for real-time position acquisition.
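The image-to-position mapping described in the abstract can be illustrated with a minimal sketch. This is a hypothetical toy network, not the authors' architecture: a single convolution layer with ReLU and max pooling, followed by a linear regression head that outputs a 2-D surface position estimate from a grayscale terrain patch. All weights here are random and untrained; in the paper's setting, such weights would be learned offline from DTM-rendered images.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D cross-correlation of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling, discarding any ragged border."""
    h2, w2 = x.shape[0] // size, x.shape[1] // size
    return x[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

def cnn_position(img, kernels, w_out, b_out):
    """Map an image to an estimated (x, y) position: conv -> ReLU -> pool -> linear."""
    feats = [max_pool(relu(conv2d(img, k))) for k in kernels]
    flat = np.concatenate([f.ravel() for f in feats])
    return w_out @ flat + b_out  # shape (2,): estimated position

# Example on a 16x16 patch standing in for a rendered-DTM image.
rng = np.random.default_rng(0)
img = rng.random((16, 16))
kernels = [rng.standard_normal((3, 3)) * 0.1 for _ in range(4)]
feat_dim = 4 * 7 * 7  # 4 kernels; (16 - 3 + 1) // 2 = 7 pooled side length
w_out = rng.standard_normal((2, feat_dim)) * 0.01
b_out = np.zeros(2)
pos = cnn_position(img, kernels, w_out, b_out)
print(pos.shape)  # (2,)
```

Because the trained network is a fixed set of weights and a forward pass like the one above, it can be exported and evaluated onboard without any dynamical model in the loop, which is the portability property the abstract highlights.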

Original language: English (US)
Title of host publication: Spaceflight Mechanics 2017
Editors: Jon A. Sims, Frederick A. Leve, Jay W. McMahon, Yanping Guo
Publisher: Univelt Inc.
Pages: 3293-3302
Number of pages: 10
ISBN (Print): 9780877036371
State: Published - 2017
Event: 27th AAS/AIAA Space Flight Mechanics Meeting, 2017 - San Antonio, United States
Duration: Feb 5, 2017 - Feb 9, 2017

Publication series

Name: Advances in the Astronautical Sciences
Volume: 160
ISSN (Print): 0065-3438

Other

Other: 27th AAS/AIAA Space Flight Mechanics Meeting, 2017
Country: United States
City: San Antonio
Period: 2/5/17 - 2/9/17

Bibliographical note

Copyright 2017 Elsevier B.V., All rights reserved.

