Deep learning-based strategies for the detection and tracking of drones using several cameras is a paper by Eren Unlu, Emmanuel Zenou, Nicolas Rivere, and Paul-Edouard Dupouy.
In the domain of drone detection, computer vision has emerged as a robust solution compared to alternatives such as radar, acoustics, and RF signal analysis. Among computer vision-based methods, deep learning algorithms have gained popularity due to their effectiveness. This paper introduces an autonomous system for drone detection and tracking that uses a static wide-angle camera together with a narrower-angle, zoomed camera mounted on a rotating turret.
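The paper describes this two-camera coupling only at the level summarized above; the sketch below is a hypothetical illustration of the general idea, not the authors' implementation. `detect_drones` is a stand-in for any deep learning detector run on the wide-angle frame, and `deg_per_px` is an assumed calibration constant mapping pixel offsets to turret pan/tilt increments.

```python
# Minimal sketch: detect a drone in the static wide-angle frame, then convert
# its pixel offset from the image centre into pan/tilt commands for the
# rotating turret that carries the zoomed camera. All names and constants
# here are illustrative assumptions, not from the paper.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    cx: float     # bounding-box centre, x (pixels)
    cy: float     # bounding-box centre, y (pixels)
    score: float  # detector confidence

def detect_drones(frame) -> List[Detection]:
    """Hypothetical placeholder for a deep learning drone detector."""
    raise NotImplementedError

def turret_command(det: Detection,
                   frame_size: Tuple[int, int],
                   deg_per_px: float = 0.05) -> Tuple[float, float]:
    """Map the detection's offset from the wide-angle image centre
    to (pan, tilt) increments in degrees (assumed linear calibration)."""
    w, h = frame_size
    pan = (det.cx - w / 2) * deg_per_px
    tilt = (det.cy - h / 2) * deg_per_px
    return pan, tilt

if __name__ == "__main__":
    det = Detection(cx=1500.0, cy=400.0, score=0.9)  # fake detection
    print(turret_command(det, frame_size=(1920, 1080)))
```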
Publication Date: 2019
Deep learning-based strategies for the detection and tracking of drones using several cameras contains the following major sections:
- Introduction
- Drone detection and tracking
- Main architecture of the system
- Detection on multiple overlaid images with a single architecture
- Experimental results
- Conclusion
Open Access Paper. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
C-UAS Hub does not own this content and provides a link at the bottom of the page so users can access it in its original location. This allows the author(s) to track important article metrics related to their work. All credit goes to the content's rightful owner.
Post Image: Fig. 5 – Proposed montaged multiple-image detection. The frame coming from the zoomed camera is rescaled and overlaid on the main image plane. The position of this overlay is determined by the presence of tracks: the new image is placed as far as possible from existing tracks, so that the system can continue tracking. (Image Credit: the authors of Deep learning-based strategies for the detection and tracking of drones using several cameras)
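The placement rule the caption describes (keep the overlay as far as possible from existing tracks) can be illustrated as a maximin selection over candidate positions. The sketch below is an assumption, not the paper's code: the function name, the corner candidates, and the Euclidean-distance criterion are all hypothetical choices consistent with the caption.

```python
# Minimal sketch of the overlay-placement rule: among candidate positions on
# the main image plane, place the rescaled zoomed-camera frame at the one
# whose distance to the nearest existing track is largest, so the overlay
# avoids covering objects currently being tracked.

from math import hypot
from typing import List, Tuple

Point = Tuple[float, float]

def farthest_overlay_position(tracks: List[Point],
                              candidates: List[Point]) -> Point:
    """Pick the candidate overlay centre maximizing the distance to the
    nearest existing track (maximin placement)."""
    def nearest_track_dist(c: Point) -> float:
        if not tracks:               # no active tracks: any position works
            return float("inf")
        return min(hypot(c[0] - t[0], c[1] - t[1]) for t in tracks)
    return max(candidates, key=nearest_track_dist)

# Example: four corner positions of a 1920x1080 main frame (hypothetical).
corners = [(240, 135), (1680, 135), (240, 945), (1680, 945)]
print(farthest_overlay_position(tracks=[(300, 200)], candidates=corners))
```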
For additional multimedia resources, please visit the Multimedia Library.