The commercialization of drones has granted the public unprecedented access to unmanned aviation. As a result, the detection, tracking, and classification of drones with radar have become areas of high demand for mitigating the accidental or deliberate misuse of these platforms. This paper focuses on the classification of drone targets in a safety context, where the concept of Explainable AI is of particular interest. Here, we propose a simple yet effective means of extracting a salient symmetry feature from the micro-Doppler signatures of drone targets, arising from their onboard rotary components. Most importantly, this approach preserves the explainable nature of the employed recognition algorithm, as the symmetry feature is directly related to the kinematics of the drones as the targets of interest. A large dataset comprising over 280 minutes of rotary- and fixed-wing drone flights, collected at multiple locations, is used to demonstrate the generalization capability of this approach.
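As a rough illustration of the idea, one way a symmetry feature could be computed from a micro-Doppler signature is to mirror the Doppler spectrum about its strongest (body-Doppler) bin and correlate the two halves: rotor blade lines appear on both sides of the body return, so a symmetric signature yields a correlation near 1. This is a minimal sketch under that assumption, not the paper's actual algorithm; the function name, the mirrored-half correlation metric, and the synthetic spectrum are all illustrative.

```python
import numpy as np

def symmetry_feature(spectrum: np.ndarray) -> float:
    """Pearson correlation between the spectrum and its mirror image
    about the strongest (assumed body-Doppler) bin. Illustrative only;
    not the metric used in the paper."""
    center = int(np.argmax(spectrum))
    half = min(center, len(spectrum) - 1 - center)
    if half == 0:
        return 0.0
    left = spectrum[center - half:center][::-1]   # lower-Doppler side, mirrored
    right = spectrum[center + 1:center + 1 + half]
    l = left - left.mean()
    r = right - right.mean()
    denom = np.sqrt((l ** 2).sum() * (r ** 2).sum())
    return float(l @ r / denom) if denom > 0 else 0.0

# Synthetic Doppler cut: body return plus a symmetric pair of blade lines.
bins = np.arange(-64, 65, dtype=float)
spec = np.exp(-bins ** 2 / 4.0)              # body-Doppler peak
spec += 0.3 * np.exp(-(bins - 20) ** 2)      # blade line at +20 bins
spec += 0.3 * np.exp(-(bins + 20) ** 2)      # blade line at -20 bins
print(symmetry_feature(spec))                # close to 1.0 for this symmetric spectrum
```

A fixed-wing target without rotating blades would lack the paired sidebands, driving the correlation toward lower values, which is the intuition behind using symmetry as a kinematics-linked, explainable feature.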
|Name|IEEE National Radar Conference - Proceedings|
|Conference|2020 IEEE Radar Conference (RadarConf20)|
|Period|21/09/20 → 25/09/20|
- staring radar
- supervised learning