Drone Detection and Tracking System Based on Fused Acoustical and Optical Approaches is a paper by Siyi Ding, Xiao Guo, Ti Peng, Xiao Huang, and Xiaoping Hong.

With the growing prevalence of small drones, the need for a robust drone-focused surveillance system that operates day and night has become pressing. To address this, we introduce a novel system known as the Multimodal Unmanned Aerial Vehicle 3D Trajectory Exposure System (MUTES), which leverages acoustic and optical sensor fusion for detecting and tracking drone targets.

MUTES integrates various sensor modules, including a 64-channel microphone array, a camera, and a lidar. The microphone array offers semispherical coverage with a high signal-to-noise ratio for sound source direction estimation, while the long-range lidar and telephoto camera provide precise localization within a narrower yet highly defined field of view. MUTES combines a passive-to-active localization scheme with a coarse-to-fine strategy, achieving wide semispherical detection coverage together with fine 3D tracking accuracy.
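The passive-to-active, coarse-to-fine pipeline can be pictured with a short sketch (hypothetical function names and coordinate conventions, not the authors' code): the microphone array supplies a coarse direction cue over the wide semispherical field of view, the gimbaled telephoto camera and lidar are steered toward that direction, and an active range measurement converts the cue into a 3D position.

```python
import numpy as np

def coarse_to_fine_step(acoustic_doa_deg, steer_gimbal, lidar_measure):
    """One passive-to-active detection step (illustrative sketch only).

    acoustic_doa_deg : (azimuth, elevation) in degrees from the microphone array
    steer_gimbal     : callable that points the telephoto camera / lidar
    lidar_measure    : callable returning a range in meters along the current
                       boresight, or None if the target is not yet in the narrow FOV
    """
    az, el = acoustic_doa_deg
    steer_gimbal(az, el)                    # coarse: steer toward the acoustic cue
    rng = lidar_measure()                   # fine: active range measurement
    if rng is None:
        return None                         # target not yet acquired optically
    # Convert direction + range into a 3D position (sensor at origin,
    # azimuth measured clockwise from north, ENU-style axes)
    az_r, el_r = np.radians([az, el])
    return np.array([
        rng * np.cos(el_r) * np.sin(az_r),  # east
        rng * np.cos(el_r) * np.cos(az_r),  # north
        rng * np.sin(el_r),                 # up
    ])
```

The wide-but-coarse acoustic stage and the narrow-but-precise optical stage complement each other, which is the essence of the fusion strategy described above.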

To enhance fidelity, an environmental denoising model isolates valid acoustic features of drone targets, overcoming the limitations of conventional sound source localization methods under noise interference. The proposed sensor fusion methodology is validated through field experiments. MUTES offers long detection range, high 3D positional precision, robust anti-interference capability, and cost-effectiveness for countering unverified drone intrusions, representing a notable advancement in the domain.
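For intuition only, the sketch below uses classical spectral subtraction as a stand-in for an environmental denoising step: a noise spectrum estimated from a target-free recording is subtracted from the incoming microphone signal before direction estimation. This is a generic technique chosen for illustration; the paper's denoising model is not described here and may differ substantially.

```python
import numpy as np

def spectral_subtraction(signal, noise, frame=1024, hop=512):
    """Suppress stationary background noise from a 1D microphone signal.

    Generic spectral-subtraction stand-in: the average noise magnitude spectrum
    is estimated from a target-free recording and subtracted frame by frame.
    """
    win = np.hanning(frame)
    # Average noise magnitude spectrum over frames of the noise-only recording
    noise_frames = [noise[i:i + frame] * win
                    for i in range(0, len(noise) - frame, hop)]
    noise_mag = np.mean([np.abs(np.fft.rfft(f)) for f in noise_frames], axis=0)

    out = np.zeros(len(signal))
    for i in range(0, len(signal) - frame, hop):
        spec = np.fft.rfft(signal[i:i + frame] * win)
        # Subtract the noise spectrum, keeping a small spectral floor
        mag = np.maximum(np.abs(spec) - noise_mag, 0.05 * np.abs(spec))
        out[i:i + frame] += np.fft.irfft(mag * np.exp(1j * np.angle(spec)), n=frame)
    return out
```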

Publication Date: July 2023

Drone Detection and Tracking System Based on Fused Acoustical and Optical Approaches contains the following major sections:

  • Introduction
  • Related Work
  • System Overview
  • Detection Scheme
  • Experimental Evaluation
  • Conclusion

This open-access article is distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

C-UAS Hub does not own this content and provides a link for users at the bottom of the page to access it in its original location. This allows the author(s) to track important article metrics related to their work. All credit goes to its rightful owner.

For additional multimedia resources, please visit the Multimedia Library.

Post Image: The overview diagram of the drone detection scheme. a) A demonstration of the outdoor experimental scenario: the detection system is deployed in the center of a playground with a drone flying above the field. b) Field of view (FOV) for each sensor component utilized in the detection system. c) The timeline of the fusion strategy in the target-tracking Kalman filter. d) Direction estimation result obtained from the component of the acoustic array. e) Final 3D trajectory estimation obtained from the fused system. (Image Credit: Authors)
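Panel c) of the image refers to a target-tracking Kalman filter that fuses the sensor streams over time into the final 3D trajectory. A minimal constant-velocity filter over 3D position fixes, with illustrative (assumed) noise parameters rather than the paper's, might look like this:

```python
import numpy as np

class ConstantVelocityKF:
    """Minimal 3D constant-velocity Kalman filter (illustrative parameters only)."""

    def __init__(self, q=1.0, r=0.5):
        self.x = np.zeros(6)          # state: [x, y, z, vx, vy, vz]
        self.P = np.eye(6) * 100.0    # large initial uncertainty
        self.q, self.r = q, r         # process / measurement noise scales

    def predict(self, dt):
        F = np.eye(6)
        F[:3, 3:] = np.eye(3) * dt    # position += velocity * dt
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + np.eye(6) * self.q * dt

    def update(self, z):
        H = np.hstack([np.eye(3), np.zeros((3, 3))])   # measure 3D position only
        S = H @ self.P @ H.T + np.eye(3) * self.r
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(6) - K @ H) @ self.P

# Example: feed asynchronous 3D position fixes into the filter
kf = ConstantVelocityKF()
for dt, fix in [(0.1, np.array([10.0, 20.0, 15.0])),
                (0.1, np.array([10.5, 20.2, 15.1]))]:
    kf.predict(dt)
    kf.update(fix)
print(kf.x[:3])   # filtered 3D position estimate
```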