TY - GEN
T1 - Unreal Success: Vision-Based UAV Fault Detection and Diagnosis Framework
AU - de Alvear Cardenas, J.I.
AU - de Visser, C.C.
PY - 2024
Y1 - 2024
N2 - Online fault detection and diagnosis (FDD) enables Unmanned Aerial
Vehicles (UAVs) to make informed decisions upon actuator failure during
flight, adapting their control strategy or deploying emergency systems.
Although the camera is a ubiquitous sensor on board most commercial
UAVs, it has not been used within FDD systems before, mainly due to the
lack of UAV multi-sensor datasets that include actuator failure
scenarios. This paper presents a knowledge-based FDD framework based on
a lightweight LSTM network and a single-layer neural network classifier
that fuses camera and Inertial Measurement Unit (IMU) information.
Camera data are pre-processed by first computing their optical flow with
RAFT-S, a state-of-the-art deep learning model, and then extracting
features with the backbone of MobileNetV3-S. The Short-Time Fourier
Transform is applied to the IMU data to obtain their time-frequency
information. For training and assessing the proposed framework, UUFOSim
was developed: an Unreal Engine-based simulator built on AirSim that
allows the collection of high-fidelity photo-realistic camera and sensor
information, and the injection of actuator failures during flight. Data
were collected in simulation for the Bebop 2 UAV with 16 failure cases.
Results demonstrate the added value of the camera and the complementary
nature of both sensors, with failure detection and diagnosis accuracies
of 99.98% and 98.86%, respectively.
UR - http://www.scopus.com/inward/record.url?scp=85192226265&partnerID=8YFLogxK
DO - 10.2514/6.2024-0760
M3 - Conference contribution
SN - 9781624107115
T3 - AIAA SciTech Forum and Exposition, 2024
BT - Proceedings of the AIAA SCITECH 2024 Forum
PB - American Institute of Aeronautics and Astronautics Inc. (AIAA)
T2 - AIAA SCITECH 2024 Forum
Y2 - 8 January 2024 through 12 January 2024
ER -