




Onboard marker-less detection and localization of non-cooperating drones for their safe interception by an autonomous aerial system

Matouš Vrba¹, Daniel Heřt¹, and Martin Saska¹

Abstract— In this paper, a novel approach to fast 3D localization of flying objects for their interception by a Micro Aerial Vehicle (MAV) is presented. The proposed method utilizes a depth image from a stereo camera to facilitate onboard detection of drones flying in its proximity. The method does not rely on using any kind of markers, which enables localization of non-cooperating drones. This approach strongly relaxes the requirements on the drones to be detected, and the detection algorithm is computationally undemanding enough to process images online, onboard an MAV with limited computational resources. This allows using the detection system in the control feedback of an autonomous aerial intercepting system (AAIS). The output of the detection algorithm is filtered by a 3D multi-target tracking algorithm to reduce false positives, preserve temporal consistency of the detections, and predict positions of the drones (e.g. to compensate camera and processing delays). We demonstrate the importance of the advances in flying object localization presented in this paper in an experiment with an intruder-interceptor scenario, which would be unfeasible using state-of-the-art detection and localization methods.

I. INTRODUCTION

Marker-less detection and relative localization of nearby flying aerial vehicles are topics which have lately been gaining interest in the robotics community [1], [2], [3], [4], [5], [6], as they enable collision avoidance and cooperation in multi-robot systems such as compact formations and swarms. The ability to detect flying robots without special markers placed on the target saves payload weight and space, which is usually very limited on MAVs flying in compact groups, and also enables localization of uncooperative robots.
The option of localizing uncooperative and possibly malevolent flying vehicles is a vital precondition for realizing autonomous aerial intercepting systems (AAIS) targeted at preventing potentially harmful behavior of such vehicles, whether accidental or caused by malicious intent. In light of the recent cases of MAVs intruding at civilian airports, rising fears of terrorist attacks on infrastructure using modified consumer multicopters, etc., this is an increasingly interesting research challenge.

Implementing a relative localization system onboard an MAV introduces multiple challenges. The generally limited carrying capability of MAVs means that the equipment used should be light-weight and have minimal dimensions. The algorithms used have to respect the limited onboard processing power, since the system needs to run online and at a fast rate to enable dynamic control feedback, which is important in all of the above-mentioned robotic scenarios.

Manuscript received: February 24th, 2019; Revised April 25th, 2019; Accepted June 26th, 2019. This paper was recommended for publication by Editor Jonathan Roberts upon evaluation of the Associate Editor and Reviewers' comments. This work was supported by the Czech Science Foundation under research project No. 17-16900Y, CTU in Prague grant SGS17/187/13, and the OP VVV funded project CZ.02.1.01/0.0/0.0/16_019/0000765 "Research Center for Informatics".
¹ The authors are with the Faculty of Electrical Engineering, Czech Technical University in Prague, Technická 2, Prague 6, vrbamato@fel.cvut.cz.
Digital Object Identifier (DOI): see top of this page.

Fig. 1: Comparison of the same scene captured with (left to right) an RGB camera and a depth camera, and a processed depth image. The drone is marked with a red arrow. Note that the drone is hardly noticeable in the RGB image, whereas it can be easily detected in the processed depth image.
Computer vision approaches are often used to solve this challenge, such as in [1], [2], [3], [7], [8], [9], because a camera is a light, small, and easily available sensor; however, robustness, precision, and speed are their limiting factors. Drone detection using a camera placed on another aerial vehicle is a hard problem in comparison with general object detection. The drones to be detected often appear small in the image, and against a structured background they can be hard to recognize even for a human expert (see Fig. 1).

In this paper, we introduce a novel method of marker-less relative localization of drones which utilizes a depth image from a stereo camera onboard an AAIS. The presented method is fast, precise, and robust, as we demonstrate by verifying the system on an MAV platform designed for the intruder interception scenario (as described in section IV).

A. Related work

Although several drone interception systems are available, such as the SkyWall¹ or [12], most of the solutions are intended to catch a stationary flying drone during a manually controlled flight, and autonomous interception is not tackled. The systems that aim to involve autonomous detection often apply a RADAR to the problem of drone detection and localization, such as in [13], [14], [15]. Similarly, [16] presents a system combining a RADAR and a LiDAR for improved precision. However, these systems focus on detection at long ranges using large stationary ground sensors, and do not address precise onboard close-range localization, which is needed for autonomous physical elimination of drones.
IEEE Robotics and Automation Letters (RAL) paper presented at the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, November 4-8, 2019. Copyright 2019 IEEE.

Fig. 2: Structure of the proposed autonomous aerial intercepting system (left). This paper deals with the components marked with the red dashed line (see detail on the right): image processing and detection of isolated 3D components from the input depth image, followed by 3D position tracking and prediction. Elements in the green dashed line were solved in our prior work [10], [11].

There are many readily available marker-based relative localization systems which have successfully been used with MAVs, such as [7], [8], [9], but in this work we focus on marker-less detection and localization only, for obvious reasons. To tackle this problem, a commonly applied method is detecting motion discrepancies in a video. These approaches are based on the assumption that the background moves roughly uniformly between frames, whereas moving objects in the image break this uniformity and thus can be detected. A general method for detecting moving objects in an image from a camera placed on a flying MAV is described in [17]. The algorithm runs online, onboard the MAV, and is able to take into account differences between image frames caused by known movement of the camera. In [1], the authors present a combination of background subtraction, optical flow-based moving object detection, and classification to detect MAVs, with a 2D tracking algorithm to reduce false negatives.
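The motion-discrepancy assumption described above can be illustrated in a few lines of Python. The sketch below is not the algorithm of [17] or [1] — those handle real camera motion and add optical flow, classification, and tracking — but a minimal toy version under the assumption that the known camera motion reduces to an integer pixel shift; all names, frame contents, and parameter values are hypothetical.

```python
import numpy as np

def detect_moving_objects(prev, curr, cam_shift, thresh=30):
    """Naive motion-discrepancy detector: warp the previous frame by the
    known camera motion (a pure integer pixel shift here), then flag
    pixels whose intensity still changed -- these break the assumed
    uniform background motion."""
    dy, dx = cam_shift
    # Compensate the known camera motion (np.roll wraps around the image
    # border, which is fine for this toy example since both frames are
    # generated with the same roll).
    compensated = np.roll(prev, shift=(dy, dx), axis=(0, 1))
    diff = np.abs(curr.astype(np.int16) - compensated.astype(np.int16))
    return diff > thresh  # binary mask of independently moving pixels

# Synthetic frames: a horizontal intensity gradient as background, the
# camera panning 5 px between frames, and one bright "drone" blob that
# moves independently of the background.
bg = np.tile(np.arange(80, dtype=np.uint8), (60, 1))
prev = bg.copy()
prev[10:14, 10:14] = 255                      # drone in frame 1
curr = np.roll(bg, 5, axis=1)                 # camera panned by 5 px
curr[30:34, 40:44] = 255                      # drone in frame 2

mask = detect_moving_objects(prev, curr, cam_shift=(0, 5))
print(int(mask.sum()))                        # changed pixels: drone + its "ghost"
```

Note that the drone's previous position also leaves a residual ("ghosting") in the difference mask; this is one reason the cited works complement raw motion detection with classification and 2D tracking.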
Another variation of this approach is presented in [2], where the authors used motion stabilization and classification of spatio-temporal image cubes to detect aircraft in a video. Neither of these approaches addresses 3D localization, which is crucial for the realization of an AAIS.

A combined method utilizing AdaBoost-based detection, visual tracking, and a Gaussian-Mixture Probability Density Filter is used in [3] for MAV relative localization. The authors demonstrate the viability of this approach in an indoor experiment, based on which they conclude that visual tracking is needed to complement the relatively slow detection. Similarly, [4] presents a combination of template matching for detection and visual tracking in a relative localization framework, with the possibility of integrating known navigation data between cooperating MAVs for better performance.

In [18], a method for marker-less indoor MAV localization using a static stereo camera is presented. The MAVs are detected separately in each image using background subtraction and image segmentation, making use of the assumption of a static camera. The disparity between the detections is then used to calculate the corresponding distance and 3D position.

Convolutional Neural Networks (CNNs) have recently gained much popularity for object recognition in images. In [5], a performance comparison of different standard CNN models applied to drone detection in images is provided. An approach using a custom CNN classifier for detecting MAVs is presented in [6], with a focus on discriminating between MAVs and birds against a sky background. We have implemented a CNN-based detector during development of the presented AAIS platform, using the state-of-the-art CNN object detector YOLOv2 [19], adapted to the specific application. This detector was experimentally verified (see section IV), but its performance was deemed insufficient when compared to the solution presented in this paper.

B. Contributions

Most of the related works rely on offline image processing or do not achieve the framerates required for control feedback in such highly dynamic robotic scenarios as autonomous aerial interception. Moreover, state-of-the-art approaches often neglect the three-dimensional localization of the targets, focusing only on 2D detection or tracking. Approaches for 3D localization mostly use the size of the detected object in the image to estimate distance, which is unreliable and depends on knowing the real size of the object.

Our approach addresses these problems. It is fast enough to run online, onboard an MAV, without the need for visual tracking to keep track of targets between detections, and it offers full 3D position estimation, tracking, and prediction of the detected drones. Unlike detection methods relying on any sort of pre-trained model (such as the CNN-based methods), our approach is independent of the visual appearance of the drones to be localized, and thus it can be used without any prior knowledge about the type of the target drones.

The presented detection and localization algorithm is fast and robust enough to be incorporated into our autonomous MAV interception system in a feedback loop with the MPC controller of the AAIS platform (see Fig. 2). The system is able to intercept a fast-flying and actively evading MAV (see section IV-B), which, to our knowledge, has not been presented in the robotics community so far. Based on our experiments, such a system is not viable using state-of-the-art MAV detection and localization methods.

II. AUTONOMOUS AERIAL INTERCEPTING SYSTEM

A. Problem description

We presume one MAV carrying an onboard stereo camera providing depth images. The transformations between the positions of the camera corresponding to two consecutive depth images must be known with sufficient precision for the localization algorithm. The stereo camera can also be mounted on a ground robot or a static base.
An unspecified number of drones to be localized is presumed to be flying in the area visible by the stereo camera. These drones are presumed to have a shape which can be represented as a compact set of pixels in the depth image. In this paper, we assume that the AAIS is navigated into an area of expected presence of an intruder drone by an external drone detection and localization system. In particular, the proposed AAIS is designed to work together with an anti-drone system (such as the DeDrone system², which is used in our case) which frequently updates the position of the intruder drone using multiple sensors. Although the precision of such external localization (10–20 m) is not sufficient for direct aerial interception, it enables the AAIS platform to get close enough to the target for onboard localization.

Fig. 3: Steps of the algorithm for detecting isolated 3D components from a depth image: pre-processing of the input depth image, thresholding into a set of binary images, contour detection and filtering in each binary image, and contour grouping and projection to 3D positions with corresponding covariances.

B. System overview

The proposed AAIS is an autonomous multirotor aerial vehicle consisting of several cooperating elements, which are shown in Fig. 2, left. A drone detection and localization system produces candidates for 3D positions of intruding drones for a 3D tracking and prediction algorithm. Filtered positions of intruding drones are the output of the 3D tracking and prediction algorithm and are provided to a mission control state machine. The state machine controls a setpoint of a pursuit trajectory planner and a trigger of an onboard net cannon. The desired trajectory ensures that the AAIS platform reaches a suitable position relative to the predicted position of the intruding drone as fast as the dynamics of the AAIS MAV allow.
This position is specified based on system analyses and experiments with the onboard net cannon, and it guarantees that the intruder is reliably captured by the net. Once the suitable relative position with respect to the intruder is reached, the mission control state machine triggers the net cannon. The pursuit trajectory is planned by a model predictive control (MPC) planner which takes into account the AAIS MAV dynamics. A low-level SO(3) controller in close feedback with a self-localization system of the MAV stabilizes and controls the MAV position according to the planned trajectory. The algorithm for 3D tracking and prediction of multiple target positions utilizes data from the self-localization system to compensate the movement of the stereo camera between consecutive frames.

In this paper, we focus on the most challenging part of the AAIS, which is the detection, localization, 3D tracking, and position prediction of the intruding drones. The rest of the system was adapted from our prior work in MAV control and planning, and its description is omitted here because of the limited scope of this paper. Readers interested in the system control (the block "Pursuit trajectory planning" in the AAIS scheme in Fig. 2) may continue in [10], where the MPC-based system was successfully applied to multirotor MAV landing on a fast-moving ground vehicle. For the AAIS, the system was extended for pursuing targets moving in 3D and for enabling manoeuvres that allow efficient usage of the net launcher. For MAV state estimation and low-level control (the block "Self localization and low-level control of interceptor MAV"), we refer to [11], where the foundations of the system were designed for the treasure hunt challenge of the MBZIRC competition³.
The precision and reliability of the state estimator and SO(3) position controller were key properties in MBZIRC (our system provided the best performance among 147 registered teams), and they are vital also for the implementation of an AAIS. For the AAIS, the method in [11] was extended to allow highly dynamic manoeuvres, to absorb the shock of releasing the net, and to integrate the external drone localization system. The result is a working AAIS solution, as demonstrated in the experiments (see section IV), which to our knowledge is the first fully autonomous aerial intercepting system published in the scientific community.

III. NON-COOPERATING DRONE LOCALIZATION

The proposed method for marker-less relative localization of drones consists of two main parts (see Fig. 2, right): detection of isolated 3D components in a depth image, and 3D tracking and position prediction of multiple objects. The input of the detection algorithm is a depth image, which is pre-processed, and candidates for drones are detected. The candidates are selected using thresholding of the pre-processed image at different depths, followed by contour extraction, filtering, and grouping. These candidates are then processed by the second stage of the localization, which keeps a set of multiple potential drone tracks in the area. The candidates are associated with the tracks and used as inputs for a Kalman filter, which updates the tracks. Tracks with a high confidence are the output of the localization system. The two parts are described in detail in sections III-A and III-B.

A. Detection of isolated 3D components in a depth image

An algorithm for detecting drones from a depth image obtained onboard the AAIS is presented in this section.

³ mrs.felk.cvut.cz/mbzirc

Fig. 4: Example binary images thresholded from one pre-processed depth image (shown in Fig. 1) using threshold distances of (from left to right) 3 m, 6 m, 9 m, and 12 m.
Pixels that are closer than the threshold are white, and farther pixels are black. Detected contours are highlighted in red.

The motivational idea behind this algorithm is simple. We assume that a flying drone is not connected to other objects in 3D space (otherwise it would be considered a physical collision), which is a unique feature of a flying object in contrast to other objects in the scene. Therefore, if the algorithm is able to detect isolated objects in 3D, it can generate candidates for projections of drones onto the image. To achieve this, the algorithm leverages the 3D information of the depth image.

The individual steps of the detection algorithm are described in this section and visualized in Fig. 3. All parameters of the detection algorithm are listed in Table I. The parameter values used in the verification experiments have been chosen empirically based on simulations and real-world deployment.

In the first step, the raw input depth image is pre-processed by applying morphological erosion to the depth image, which is interpreted as a gray-scale image where a darker color corresponds to a closer distance.
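The thresholding and contour steps of Fig. 3 can be sketched as follows. This is an illustrative, self-contained reimplementation in plain numpy, not the authors' code: 4-connected component labeling stands in for the contour detection, a pixel-area filter stands in for the contour filtering, and the synthetic scene, threshold list, and area limits are all hypothetical (the real parameter values are those of Table I).

```python
import numpy as np
from collections import deque

def detect_isolated_components(depth, thresholds, min_area=4, max_area=400):
    """Threshold the depth image at several distances and, for each
    threshold, find connected regions of closer-than-threshold pixels.
    Regions are filtered by pixel area and each surviving candidate is
    reported as (threshold, centroid_row, centroid_col, mean_depth)."""
    candidates = []
    for t in thresholds:
        mask = depth < t                      # white pixels of one binary image
        seen = np.zeros_like(mask, dtype=bool)
        for sr, sc in zip(*np.nonzero(mask)):
            if seen[sr, sc]:
                continue
            # BFS flood fill of one 4-connected component.
            comp, queue = [], deque([(sr, sc)])
            seen[sr, sc] = True
            while queue:
                r, c = queue.popleft()
                comp.append((r, c))
                for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                    if (0 <= nr < mask.shape[0] and 0 <= nc < mask.shape[1]
                            and mask[nr, nc] and not seen[nr, nc]):
                        seen[nr, nc] = True
                        queue.append((nr, nc))
            if min_area <= len(comp) <= max_area:   # drone-sized regions only
                rows, cols = zip(*comp)
                depths = [depth[r, c] for r, c in comp]
                candidates.append((t, float(np.mean(rows)),
                                   float(np.mean(cols)), float(np.mean(depths))))
    return candidates

# Hypothetical scene: background at 15 m, a large ground region in the
# lower part of the image, and one small isolated blob (the drone) at 5 m.
depth = np.full((40, 60), 15.0)
depth[30:, :] = 2.0                   # ground: too large, rejected by the filter
depth[10:13, 20:23] = 5.0             # drone-sized isolated component

cands = detect_isolated_components(depth, thresholds=[3.0, 6.0, 9.0, 12.0])
print(cands)
```

On this synthetic scene the single drone-sized candidate is reported once per threshold above the blob's depth; in the full pipeline these per-threshold detections are then grouped across thresholds and projected to 3D positions with corresponding covariances.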
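The second stage, the Kalman-filter-based 3D tracking and position prediction described in the section III overview, can likewise be sketched for a single track. The constant-velocity motion model, the noise parameters, and the class interface below are assumptions chosen for illustration, not the paper's actual filter design.

```python
import numpy as np

class Track3D:
    """Minimal single-target 3D Kalman filter with an assumed
    constant-velocity motion model. State x = [px, py, pz, vx, vy, vz]."""

    def __init__(self, pos, pos_var=1.0, vel_var=4.0):
        self.x = np.hstack([pos, np.zeros(3)])
        self.P = np.diag([pos_var] * 3 + [vel_var] * 3)

    def predict(self, dt, accel_noise=2.0):
        F = np.eye(6)
        F[:3, 3:] = dt * np.eye(3)                        # position integrates velocity
        Q = accel_noise * np.diag([dt**3 / 3] * 3 + [dt] * 3)  # crude process noise
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q
        return self.x[:3]                                 # predicted 3D position

    def update(self, z, R):
        H = np.hstack([np.eye(3), np.zeros((3, 3))])      # only position is measured
        S = H @ self.P @ H.T + R                          # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)               # Kalman gain
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(6) - K @ H) @ self.P

# A drone moving at 1 m/s along x, measured every 0.1 s with covariance R.
track = Track3D(pos=np.array([0.0, 0.0, 5.0]))
R = 0.05 * np.eye(3)
for k in range(1, 21):
    track.predict(dt=0.1)
    track.update(np.array([0.1 * k, 0.0, 5.0]), R)

pred = track.predict(dt=0.3)   # run the model ahead of the last measurement
```

The final predict() call illustrates how such a filter can be run ahead of the last measurement, e.g. to compensate camera and processing delays as mentioned in the abstract; the candidate-to-track association and track confidence management of the full multi-target algorithm are omitted here.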