Autonomous terrain-relative navigation remains an open problem at the forefront of many space missions involving close-proximity operations. Many techniques exist to address it using both passive and active sensors, but nearly all require sophisticated dynamical models. Convolutional neural networks (CNNs) trained on images rendered from a digital terrain map (DTM) offer a way to sidestep unknown or complex dynamics while still providing reliable autonomous navigation, by mapping an image directly to a position estimate. Because trained CNNs are portable, training can be performed offline, and the matured network can then be loaded onto a spacecraft for real-time position acquisition.
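The image-to-position mapping described above can be sketched as a small forward pass: convolutional features extracted from a rendered terrain patch feed a linear head that regresses a 3-vector position. This is a minimal, hypothetical NumPy illustration only; all layer sizes, weights, and function names are assumptions and do not reflect the architecture used in the paper, where the network would be trained on DTM-rendered imagery rather than initialized randomly.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d_valid(image, kernel):
    """Single-channel 'valid' 2-D cross-correlation (conv layer stand-in)."""
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def predict_position(image, kernel, W_fc, b_fc):
    """Forward pass: conv -> ReLU -> flatten -> linear position head."""
    features = relu(conv2d_valid(image, kernel))
    return W_fc @ features.ravel() + b_fc  # 3-vector position estimate

# Illustrative shapes: a 16x16 rendered terrain patch, one 3x3 filter.
image = rng.standard_normal((16, 16))
kernel = rng.standard_normal((3, 3))
W_fc = rng.standard_normal((3, 14 * 14)) * 0.01  # flattened 14x14 feature map
b_fc = np.zeros(3)

position = predict_position(image, kernel, W_fc, b_fc)
print(position.shape)  # (3,)
```

In practice the weights would be learned offline against many DTM renderings with known camera positions, which is what makes the trained network portable to onboard hardware.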
Original language: English (US)
Title of host publication: Spaceflight Mechanics 2017
Editors: Jon A. Sims, Frederick A. Leve, Jay W. McMahon, Yanping Guo
Number of pages: 10
State: Published - 2017
Event: 27th AAS/AIAA Space Flight Mechanics Meeting, 2017 - San Antonio, United States
Duration: Feb 5 2017 → Feb 9 2017
Series name: Advances in the Astronautical Sciences
Other: 27th AAS/AIAA Space Flight Mechanics Meeting, 2017
Period: 2/5/17 → 2/9/17
Copyright 2017 Elsevier B.V., All rights reserved.