Date of Award

12-2021

Document Type

Thesis

Degree Name

Master of Science (MS)

Department

Aerospace, Physics, and Space Sciences

First Advisor

Markus Wilde

Second Advisor

Ryan White

Third Advisor

Brian Kish

Fourth Advisor

David Fleming

Abstract

With the increasing risk of collisions with space debris and the growing interest in on-orbit servicing, the ability to autonomously capture non-cooperative, tumbling target objects remains an unresolved challenge. This thesis presents an autonomous, artificial-intelligence-based solution for inspecting, avoiding, or servicing an uncooperative resident space object (RSO). The solution is built on Convolutional Neural Networks (CNNs), which are used to classify the four spacecraft features most relevant to docking and collision avoidance during rendezvous: solar panels, antennas, spacecraft bodies, and thrusters. The classifier was then extended into an object detection approach that both classifies and localizes these four features using You Only Look Once v5 (YOLOv5) and the Faster Region-based Convolutional Neural Network (Faster R-CNN). The weights obtained by training these algorithms on a spacecraft image dataset were tested on videos recorded with a spacecraft motion dynamics and orbital lighting simulator to evaluate classification and detection performance. Each test video case involved different yaw-pitch motions of the chaser and target spacecraft under varying lighting conditions. The results shown in this thesis demonstrate that the proposed vision-based approach is a viable solution for navigation.
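To make the detection stage concrete, the following is a minimal sketch of running a custom-trained YOLOv5 model frame by frame over a simulator test video using the public ultralytics/yolov5 torch.hub interface. This is not the author's actual code: the weight file spacecraft_features.pt and the video filename are hypothetical placeholders, and the four feature classes are assumed to be baked into the trained weights.

```python
import cv2
import torch

# Load custom YOLOv5 weights (hypothetical filename) through the
# ultralytics/yolov5 torch.hub entry point.
model = torch.hub.load("ultralytics/yolov5", "custom",
                       path="spacecraft_features.pt")

cap = cv2.VideoCapture("rendezvous_test_case.mp4")  # hypothetical test video
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # YOLOv5 expects RGB input; OpenCV decodes frames as BGR.
    results = model(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    # Each row of results.xyxy[0] is [x1, y1, x2, y2, confidence, class].
    for *box, conf, cls in results.xyxy[0].tolist():
        print(f"{model.names[int(cls)]}: conf={conf:.2f}, box={box}")
cap.release()
```

A per-frame loop like this is what allows the detector's bounding boxes to be evaluated against the varying yaw-pitch motions and lighting conditions described in the abstract.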

Comments

Copyright held by author
