Date of Award

12-2024

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Aerospace, Physics, and Space Sciences

First Advisor

Madhur Tiwari

Second Advisor

Ryan T. White

Third Advisor

Eric D. Swenson

Fourth Advisor

Markus Wilde

Abstract

In recent years, On-Orbit Servicing (OOS) and Active Debris Removal (ADR) have attracted increasing interest due to growing concerns about space debris. This debris poses a significant risk to spacecraft, as collisions can catastrophically end missions and potentially trigger a cascading chain reaction that generates even more debris. Space debris ranges from tiny paint chips to large non-functional spacecraft and even launch vehicle components. One way to mitigate the formation of additional space debris is to reduce the total number of large non-cooperative resident space objects (RSOs) present in operational orbits. This can be achieved through two approaches: de-orbiting these objects as part of ADR operations or, in the case of spacecraft, repairing them through OOS operations. Both strategies aim to reduce the threat these objects pose to the space environment.

To support ADR and OOS, this research advances an overarching mission concept that uses a swarm of small satellites equipped with low size, weight, power, and cost (SWaP-C) hardware, such as cameras, to autonomously and collaboratively rendezvous with and capture non-cooperative RSOs, with the goal of de-tumbling and de-orbiting them. This dissertation research focuses on the first step of this mission concept: the identification and characterization of RSOs using lightweight algorithms capable of running on low SWaP-C hardware.

This dissertation covers three main topics. The first topic explores using the convolutional neural network (CNN)-based object detection model You Only Look Once (YOLO) to detect features of interest on unknown RSOs, such as solar panels, antennas, body panels, and thrusters, that would aid rendezvous and proximity operations. The performance of several YOLO models, ranging from YOLOv5 to YOLOv8, is compared to identify the best-performing model in terms of object detection accuracy and computational efficiency.
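As a rough illustration of how such a model comparison can be set up (not the dissertation's actual code), the following Python sketch uses the Ultralytics API; the dataset file spacecraft_features.yaml, its class list, and the listed weight files are assumptions made purely for illustration.

```python
from ultralytics import YOLO

# Hypothetical candidate weights spanning the YOLOv5-to-YOLOv8 range.
CANDIDATE_WEIGHTS = ["yolov5nu.pt", "yolov8n.pt"]

for weights in CANDIDATE_WEIGHTS:
    model = YOLO(weights)
    # Fine-tune on an assumed spacecraft-feature dataset whose data.yaml lists
    # classes such as solar_panel, antenna, body_panel, and thruster.
    model.train(data="spacecraft_features.yaml", epochs=100, imgsz=640)
    # Validate and report mAP50-95 as one basis for selecting the best model.
    metrics = model.val(data="spacecraft_features.yaml")
    print(weights, metrics.box.map)
```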

The second topic implements the best-performing YOLO model in a hardware-in-the-loop testbed that demonstrates the proposed mission concept. In the demonstration, a swarm of chaser satellites, simulated by DJI RoboMaster Tello Talent quadcopters, autonomously rendezvous with the capture features of a non-cooperative RSO detected by the object detector selected in the first topic. The results highlight a strong dependency on the object detector's performance. While CNN-based object detectors show impressive performance, they also frequently fail to detect features that a human could easily identify using reasoning and contextual knowledge; for example, a human can infer that a large, blue, rectangular object projecting outward from the spacecraft body is a solar panel. This lack of contextual understanding leads to missed detections and misclassifications, making CNN-based object detectors less reliable for safety-critical tasks such as this mission concept.
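The following minimal sketch shows the kind of vision-in-the-loop control such a demonstration involves, assuming the djitellopy SDK for the Tello Talent quadcopters and an Ultralytics YOLO detector; the weight file name, control gains, and speed limits are illustrative assumptions, not the testbed implementation.

```python
from djitellopy import Tello
from ultralytics import YOLO

detector = YOLO("best_spacecraft_yolo.pt")   # assumed weights from the first topic
drone = Tello()
drone.connect()
drone.streamon()
drone.takeoff()

try:
    while True:
        frame = drone.get_frame_read().frame
        result = detector(frame, verbose=False)[0]
        if len(result.boxes) == 0:
            drone.send_rc_control(0, 0, 0, 20)    # nothing detected: yaw slowly to search
            continue
        # Steer toward the highest-confidence capture feature (simple proportional control).
        boxes = result.boxes
        best = int(boxes.conf.argmax())
        cx = float(boxes.xywh[best, 0])           # detected feature's center x in pixels
        error = cx - frame.shape[1] / 2.0         # horizontal offset from image center
        yaw_rate = int(max(-40, min(40, 0.2 * error)))
        drone.send_rc_control(0, 15, 0, yaw_rate) # creep forward while centering the feature
finally:
    drone.land()
    drone.streamoff()
```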

To address these shortcomings, the third topic introduces a novel, specialized spacecraft component detector called SpaceYOLO (SpY), which combines the strengths of CNNs and traditional computer vision techniques. SpY incorporates human-inspired reasoning and contextual knowledge to improve detection performance. Unlike a conventional CNN-based object detector, which outputs detected features directly from an input image, SpY first identifies primitive shapes in the image using a CNN object detector and then associates these shapes with spacecraft features based on color and texture information. This approach improves interpretability and makes the detector context-aware. An ensemble of SpY with the selected YOLO model demonstrated superior performance to the other CNN-based object detectors evaluated in this study, enhancing safety and reliability for potential use in ADR and OOS tasks.
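To make the two-stage idea concrete, here is a rough Python sketch of a SpY-like pipeline under stated assumptions: a CNN (here an Ultralytics YOLO model with hypothetical weights primitive_shapes.pt) proposes primitive shapes, and simple color/texture heuristics then map each shape to a spacecraft feature. The class names, hue and texture thresholds, and feature labels are illustrative, not the actual SpY rules.

```python
import cv2
from ultralytics import YOLO

# Hypothetical shape detector with classes such as rectangle, circle, cylinder.
shape_detector = YOLO("primitive_shapes.pt")

def classify_feature(crop, shape_name):
    """Map a primitive shape to a spacecraft feature using color/texture cues."""
    hsv = cv2.cvtColor(crop, cv2.COLOR_BGR2HSV)
    mean_hue, mean_sat = hsv[..., 0].mean(), hsv[..., 1].mean()
    texture = cv2.Laplacian(cv2.cvtColor(crop, cv2.COLOR_BGR2GRAY), cv2.CV_64F).var()
    if shape_name == "rectangle" and 90 <= mean_hue <= 140 and mean_sat > 60:
        return "solar_panel"       # large blue rectangle
    if shape_name == "circle" and texture < 50:
        return "antenna_dish"      # smooth circular region
    if shape_name == "cylinder":
        return "thruster"
    return "body_panel"

def spy_like_detect(image):
    """Return (bounding box, feature label) pairs for one image."""
    detections = []
    for box in shape_detector(image, verbose=False)[0].boxes:
        x1, y1, x2, y2 = map(int, box.xyxy[0].tolist())
        shape_name = shape_detector.names[int(box.cls)]
        crop = image[y1:y2, x1:x2]
        detections.append(((x1, y1, x2, y2), classify_feature(crop, shape_name)))
    return detections
```

Because the shape-to-feature rules are explicit, a misclassification can be traced back to a specific rule, which reflects the interpretability benefit described above.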
