Autonomous Bio-Inspired Small Object Detection and Avoidance

Michael Ohradzansky1, Hector E. Alvarez2, Jishnu Keshavan3, Badri N. Ranganathan3, and J. Sean Humbert3

Abstract— Small object detection and avoidance in unknown environments is a significant challenge for small autonomous vehicles, which are generally highly agile and restricted in payload and computational processing power. Typical machine-vision and range-measurement-based solutions suffer either from restricted fields of view or significant computational complexity, and are not easily portable to small platforms. In this paper, a novel bio-inspired navigation technique is introduced that is modeled using analogues of the small-field motion-sensitive interneurons of the insect visuomotor system. The proposed technique achieves small-field object detection based on Fourier residual analysis of instantaneous optic flow. The small-field signal is used to extract the relative range and bearing of the nearest obstacle, which is then combined with a low-order steering control law based on an artificial potential function. The proposed sensing and control scheme is experimentally validated with a quadrotor vehicle that is able to effectively navigate an unknown environment laden with small-field clutter. This bio-inspired approach is computationally efficient and serves as a robust reflexive solution to the problem of small object detection and avoidance for autonomous robots.

I. INTRODUCTION

Autonomous aerial microsystems are expected to be deployed in multiple applications such as search and rescue, exploration, and urban navigation. However, significant restrictions in size, weight, and power make it difficult for these platforms to accomplish safe navigation in unknown environments characterized by obstacles of different shapes, sizes, and textures. Current approaches to the problem of autonomous obstacle detection and avoidance rely on the use of either range measurement systems such as LIDAR (Light Detection and Ranging) [1], [2], [3], [4], or monocular and stereo camera systems implementing machine vision techniques [5], [6], [7]. Both sensing techniques suffer from intrinsic limitations which make them difficult to deploy on a small-scale platform with a payload of a few grams. For instance, a customized Kalman filter called SAMWISE is implemented in [7] that fuses measurements from inertial measurement units, LIDAR, GPS (Global Positioning System), vision, magnetometer, barometer, and laser altimeter sensors in order to accomplish fast autonomous flights through a 60 m long obstacle course. However, scaling these different sensors down to micro-sized flying vehicles poses significant difficulty. Additionally, these studies rely on sensors with limited fields of view and require a ground station to carry out computationally intensive processing, which limits their use.

1 Department of Aerospace Engineering Sciences, University of Colorado, Boulder, CO; mioh0887@colorado.edu
2 Scientific Systems Company, Inc., Woburn, MA
3 Department of Mechanical Engineering, University of Colorado, Boulder, CO

Hence, there is a need to develop computationally efficient perception strategies, preferably relying on omnidirectional sensors, for deployment on aerial microsystems. Biologically inspired approaches based on observation and analysis of insect behavior offer an alternative paradigm for addressing this problem. Flying insects have existed for hundreds of millions of years and are living demonstrations of small, fast, and agile systems capable of accomplishing complex navigational tasks using computationally efficient biological sensing and feedback control circuits.
Insects are known to use optic flow [8], the characteristic pattern of motion that forms on the retina of the fly as it moves and encodes motion cues such as relative proximity and velocity to objects in the environment, for demonstrating different behaviors such as collision avoidance [9] and landing [10]. Experiments demonstrating the centering response of honeybees were reported in [11], where the response relied on minimization of lateral optic flow asymmetry. Furthermore, honeybees regulated forward speed by maintaining a fixed average optic flow value during flight, resulting in slowing down while navigating through narrow tapered corridors. The lobula plate in the insect eye contains different sets of motion-sensitive neurons, prominently those sensitive to horizontal motion such as the HS, H1, H2, CH, and FD cells [12], and those sensitive to vertical motion such as the VS cells [13]. These specialized neurons process incoming optic flow patterns to extract behaviorally relevant cues for navigation [14] and feed into different motor neurons, thereby directly affecting flight response [15], [16]. The HS, H1, and H2 cells are known to spatially integrate stimulus from a wide field of view and are responsible for actuation commands in response to background motion and wide objects. There have been several attempts at creating engineering analogues of this wide-field spatial integration aspect for demonstrating centering in corridors and object avoidance in ground and aerial vehicles [17], [18], [19], [20], [21]. A significant advantage of the optic flow approach is that it lends itself to being realized in hardware using analog VLSI techniques, which can provide outputs at rates as high as 1 kHz while consuming only microwatts of power [22].

The role of the figure detection (FD) and centrifugal horizontal (CH) neurons has been explored specifically in the context of small object detection [23], [24]. While the response to wide-field motion patterns is understood through the technique of spatial integration, the processing for small-field detection is not as straightforward. The experiments in [23], [25] highlight the different components involved in the small-field processing circuitry. It was found that the FD cells in particular are tuned to respond preferentially to small-field objects in motion [23]. The CH cells receive ipsilateral visual motion information via dendro-dendritic synapses from HS cells. The input received through this dendro-dendritic synapse creates a spatial blurring effect of the motion image [25], [26], which is then inhibited from the input to the FD neurons, resulting in the sharpening of the original stimulus and emphasizing the small object information. This interaction of the various elements of small-field processing is depicted in Fig. 1, adapted from [24].

Fig. 1. Interaction of different neurons for small-field detection, adapted from [24].
There are no known engineering attempts that replicate this small-field neuronal processing strategy. The main contribution of this work is the demonstration of a computationally efficient method for small object perception and avoidance through the creation of an engineering analogue of the FD neuron system, which can be deployed on small autonomous aerial microsystems for indoor and outdoor applications. The engineering equivalent of the CH cells obtains the low spatial frequency component of the input stimulus, which is then removed from the stimulus to the FD cells. The resultant operation retains the high spatial frequency component of the stimulus, which directly encodes information about the relative range and bearing of small-field objects in the local environment. These motion cues are then extracted and combined with a steering controller based on an artificial potential function to achieve safe, robust navigation in an obstacle field laden with small-field clutter. The proposed navigation strategy relies entirely on the extraction of the small-field optic flow signal, and thus renders extraction of local environment structure or estimation of vehicle velocity states superfluous. The proposed scheme is therefore inherently more robust to variation in local environment structure and reference flight condition.

The organization of the paper is as follows. Section II first introduces the technique of spatial decomposition of planar optic flow signals for environments with wide- and small-field objects. The mathematical description of the procedure for small-field signal extraction and its use in a steering controller is presented subsequently. Section III provides a detailed description of the experimental setup, including details of the vehicle and vision sensor hardware deployed. Section IV provides results and discussion of the navigation experiments that validate the proposed sensing and control scheme. Section V presents the conclusions from this study.

II. SMALL OBJECT AVOIDANCE USING PLANAR OPTIC FLOW

In this section, a novel approach of small-field decomposition of planar optic flow is presented for extraction of relevant motion cues for navigation in unstructured environments laden with small-field clutter. The structure of the insect visuomotor pathway provides inspiration for this technique of processing instantaneous optic flow patterns from planar imaging surfaces. Specifically, in this study an engineering equivalent of the FD cell processing of measured optic flow is considered for small-field detection. The small-field structure in the local environment induces a high-frequency spatial component of instantaneous optic flow that can be mathematically approximated as the residual of an appropriate spatial Fourier series.

A. Spatial decomposition of planar optic flow

Planar optic flow can be approximated as the relative velocity vector of material points in the surrounding environment projected into the tangential space of the circular imaging surface. In a stationary environment, it is a function of observer rotational and translational motion together with relative proximity to surrounding objects. If the spatial distribution of objects in the environment is modeled as a continuous function of the body-referenced viewing angle γ, the optic flow field can be written as

Q(\gamma, x) = \frac{1}{d(\gamma, q)} \left( u \sin\gamma - v \cos\gamma \right)    (1)

where d(γ, q) is the radial distance to the nearest point in the visual field at angle γ, u and v are the body-referenced forward and lateral velocity components, respectively, and x = (q, q̇) is the vehicle state with q = (x, y, θ) as the vehicle pose with respect to the environment (Fig. 2).

Fig. 2. Planar coordinate definition for a generic unstructured environment.
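As a concrete illustration of Eq. (1), the short Python sketch below evaluates the tangential optic flow seen by a vehicle translating along a corridor that contains two pole-like obstacles, similar to the scenario of Fig. 3. This is a minimal example under assumed values: the corridor half-width, pole positions and radii, flight speeds, and the ray-casting distance function standing in for d(γ, q) are illustrative choices, not the authors' simulation.

```python
import numpy as np

def ray_to_walls(y0, gamma, half_width):
    """Distance along the ray (cos g, sin g) from (0, y0) to corridor walls at y = +/- half_width."""
    s = np.sin(gamma)
    d = np.full_like(gamma, np.inf)
    d[s > 1e-9] = (half_width - y0) / s[s > 1e-9]
    d[s < -1e-9] = (-half_width - y0) / s[s < -1e-9]
    return d

def ray_to_pole(y0, gamma, cx, cy, radius):
    """Distance along the ray from (0, y0) to a circular pole at (cx, cy); inf where the ray misses."""
    dx, dy = np.cos(gamma), np.sin(gamma)
    fx, fy = -cx, y0 - cy                         # ray origin relative to the pole centre
    b = fx * dx + fy * dy
    disc = b**2 - (fx**2 + fy**2 - radius**2)
    t = -b - np.sqrt(np.maximum(disc, 0.0))       # nearest of the two intersections
    t[(disc < 0.0) | (t <= 0.0)] = np.inf
    return t

# Assumed scene and flight condition (illustrative values only)
gamma = np.linspace(-np.pi, np.pi, 360, endpoint=False)   # body-referenced viewing angles (rad)
u, v, y0 = 1.0, 0.0, 0.0                                   # forward/lateral speed (m/s), lateral offset (m)

d = ray_to_walls(y0, gamma, half_width=1.5)
for cx, cy in [(3.0, 0.6), (6.0, -0.7)]:                   # two small-field poles ahead of the vehicle
    d = np.minimum(d, ray_to_pole(y0, gamma, cx, cy, radius=0.08))

# Eq. (1): smooth wall-induced wide-field flow plus sharp perturbations at the pole bearings
Q = (u * np.sin(gamma) - v * np.cos(gamma)) / d
```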
For motion restricted to a plane, it is well known that wide-field optic flow patterns are spatially periodic signals that reside in L2, the space of square-integrable and piecewise continuous functions, and can be modeled using the first N harmonics of the Fourier series as [18]

Q_{WF}(\gamma, x) = \frac{a_0}{2} + \sum_{n=1}^{N} \left( a_n \cos n\gamma + b_n \sin n\gamma \right),
a_n = \frac{1}{\pi} \int_{-\pi}^{\pi} Q(\gamma, x) \cos n\gamma \, d\gamma, \qquad
b_n = \frac{1}{\pi} \int_{-\pi}^{\pi} Q(\gamma, x) \sin n\gamma \, d\gamma    (2)

For vehicle flight close to the centerline of a straight corridor, the tangential optic flow profile resembles a sine-wave-like pattern, with the peak amplitude proportional to the forward speed and inversely proportional to the perpendicular distance from the vehicle to the wall. Fig. 3 depicts the instantaneous optic flow field induced by simulated vehicle flight in a textured corridor with two small-field obstacles (poles), while Fig. 4 depicts the instantaneous small-field residual signal reconstructed with sensor noise for the case shown in Fig. 3. From these figures it is clear that the introduction of a small-field obstacle induces a high-frequency spatial perturbation in the nominal optic flow pattern. The objective then is to eliminate the low-frequency wide-field content present in the optic flow so that only the small-field (SF) signal, which encodes the motion cues necessary for small-field obstacle avoidance, remains. Additionally, a thresholding mechanism needs to be implemented to mitigate the influence of high-frequency noise.

Fig. 3. Spatial decomposition of instantaneous optic flow for straight-line flight along a corridor laden with two small-field obstacles (poles).

Fig. 4. Optic flow residual signal.

B. FD cell analogue

In this section, a method for extracting the high-frequency content of optic flow is introduced that is based on modeling analogues of the small-field motion-sensitive interneurons found in the lobula plate of the insect visuomotor system [24]. When flying through an unknown, potentially cluttered environment, the presence of small objects manifests as high-frequency perturbations in the raw tangential optic flow signal. In order to extract this high-frequency content, the wide-field (WF) content, which encodes low spatial frequency information about the surrounding environment, needs to be removed from the measured optic flow signal. This is accomplished by reconstructing the wide-field signal using the first four Fourier harmonics in Eq. (2). The SF signal is then approximated as the residual of this Fourier series:

Q_{SF}(\gamma, x) = Q(\gamma, x) - Q_{WF}(\gamma, x)    (3)
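The following sketch is a minimal implementation of the FD-cell analogue of Eqs. (2) and (3). It assumes the tangential optic flow has been sampled at uniformly spaced viewing angles (for example, one value per azimuthal bin of the omnidirectional camera) and approximates the Fourier-coefficient integrals by sums; the function name and sampling layout are assumptions, not the authors' code.

```python
import numpy as np

def small_field_residual(Q, gamma, n_harmonics=4):
    """FD-cell analogue: reconstruct the wide-field signal Q_WF from the first
    n_harmonics Fourier harmonics of Q(gamma) (Eq. 2) and return the small-field
    residual Q_SF = Q - Q_WF (Eq. 3), together with Q_WF itself."""
    dgamma = 2.0 * np.pi / len(gamma)                 # uniform azimuthal spacing assumed
    a0 = (1.0 / np.pi) * np.sum(Q) * dgamma
    Q_wf = np.full_like(Q, a0 / 2.0)
    for n in range(1, n_harmonics + 1):
        an = (1.0 / np.pi) * np.sum(Q * np.cos(n * gamma)) * dgamma
        bn = (1.0 / np.pi) * np.sum(Q * np.sin(n * gamma)) * dgamma
        Q_wf += an * np.cos(n * gamma) + bn * np.sin(n * gamma)
    return Q - Q_wf, Q_wf

# Applied to the simulated corridor flow from the previous sketch:
# Q_sf, Q_wf = small_field_residual(Q, gamma)
```

With the first four harmonics, the reconstruction captures the slowly varying wall-induced flow, so the residual isolates the narrow peaks caused by the poles.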
C. Steering control

The SF signal (Eq. (3)) encodes the high spatial frequency information of the surrounding environment. The magnitude and azimuthal location of these high-frequency perturbations then provide information about the relative range and bearing of small-field objects in the local environment at any instant. A nonlinear, discontinuous steering control law based on an artificial potential function is then constructed as [27]

u_{sc} = k_0 \, \mathrm{sign}(\gamma_0) \, e^{-c_\gamma |\gamma_0|} \, e^{-c_d d_0}    (4)

where d0 is the inverse of the maximum magnitude of the SF signal and γ0 is the corresponding azimuthal location; d0 corresponds to the relative range and γ0 to the relative bearing of the obstacle at any instant. Note that as the vehicle approaches an obstacle, the magnitude of the small-field optic flow increases, hence d0 decreases, which results in a larger corrective yaw command that steers the vehicle away from the obstacle. The discontinuous function sign(γ0) is used to ensure that the control law results in a yaw rate command that produces a corrective maneuver to steer the vehicle away from obstacles located on either the port or the starboard side of the vehicle. The gain parameters k0, cγ, and cd can be used to tune the control authority and vehicle responsiveness to obstacles in the azimuthal and radial directions, respectively.

Fig. 5. Schematic diagram of quadrotor components: parabolic mirror, oCam inside mount, PixHawk flight controller, Lidar, PX4Flow optic flow sensor, and oDroid XU4.

Fig. 6. Information flow diagram for bio-inspired small object detection and avoidance.

The presence of sensor noise in the small-field signal could result in spurious detection of obstacles in the local environment. In order to address this issue, a threshold detection algorithm is implemented in this study, which requires the magnitude peak to cross a user-defined threshold before a corrective yaw rate command is produced. This eliminates the problem of spurious detection by producing a corrective maneuver only in the presence of a true obstacle.
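A minimal sketch of how the thresholded steering command of Eq. (4) might be evaluated from the SF residual is given below; the gain values, the detection threshold, and the peak-picking step are illustrative assumptions and do not reproduce the tuned parameters used in the experiments.

```python
import numpy as np

def steering_command(Q_sf, gamma, k0=1.0, c_gamma=2.0, c_d=1.0, threshold=0.5):
    """Thresholded steering law in the spirit of Eq. (4). Returns a yaw-rate
    command, or zero when the small-field peak stays below the noise threshold."""
    i = int(np.argmax(np.abs(Q_sf)))          # location of the strongest SF perturbation
    peak = abs(Q_sf[i])
    if peak < threshold:                      # threshold detection: ignore noise-level peaks
        return 0.0
    gamma0 = gamma[i]                         # relative bearing of the nearest obstacle (rad)
    d0 = 1.0 / peak                           # relative range ~ inverse of the SF peak magnitude
    # Eq. (4): command grows as d0 shrinks; sign(gamma0) picks the turn direction
    return k0 * np.sign(gamma0) * np.exp(-c_gamma * abs(gamma0)) * np.exp(-c_d * d0)

# Example, using the residual from the previous sketch:
# yaw_rate_cmd = steering_command(Q_sf, gamma)
```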
III. EXPERIMENTAL SETUP

In this section, the specifics of the sUAS setup used for testing, the details of the control architecture, and the specifics of the vision sensor hardware are presented.

A. Vehicle Setup

The system used for testing is built around the DJI F330 frame, which has a motor span of 33 cm (see Fig. 5). Attitude control and rate tracking of the vehicle are accomplished on a PixHawk running the PX4 flight stack by fusing the sensor data from rate gyros, accelerometers, and magnetometers. The PX4 flight stack is open-source firmware that is used for attitude control as well as control command processing. It is a system requirement that the vehicle be able to hold position and have an accurate onboard estimate of the vehicle's body velocities. For this reason, a downward-facing PX4Flow optic flow sensor and a LidarLite v3 range sensor are added to the ventral side of the system to provide height and lateral velocity estimates. With the addition of the data …

Fig. 7. (A) Snapshot of the nominal (red) and perturbed (orange) tangential small …
