Research progress on visual perception systems for bionic flapping-wing aerial vehicles
Abstract: The bionic flapping-wing aerial vehicle (FWAV) is a type of aerial vehicle that imitates birds and insects, generating lift and thrust through active wing motion. Owing to advantages such as high flight efficiency, strong maneuverability, and good stealth, FWAVs have attracted growing attention from researchers in recent years. With its compact structure and easy operation, a small FWAV can adapt to complex environments, but the same qualities limit its payload capacity and battery endurance; heavy, power-hungry sensors are therefore unsuitable for FWAVs in many scenarios. Most of the information that animals acquire from their environment comes through vision, and as an efficient channel for gathering information, vision plays an irreplaceable role in FWAV applications. Vision sensors are lightweight, consume little power, and provide information-rich images, making them well suited for FWAVs. With the continued development of microelectronics and image processing, visual perception systems built on FWAV platforms have made important progress. This review first introduces the visual perception systems of several representative FWAVs developed in China and abroad, classifying them into two categories: onboard and off-board systems. It then surveys the state of three key technologies: image stabilization, object detection and recognition, and object tracking. This survey shows that research on FWAV visual perception systems is still at an early stage. Finally, the review identifies future research directions, including image stabilization, onboard real-time processing, object detection and recognition, and three-dimensional reconstruction.
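The abstract singles out image stabilization as the first key technology, since wing flapping injects strong periodic shake into onboard video. As a purely illustrative, minimal sketch (not the method of any system reviewed here), the following Python/OpenCV code performs feature-based digital stabilization by locking every frame to the coordinate frame of the first one; the parameter values and function names inside are assumptions chosen for the example.

```python
import cv2
import numpy as np

def stabilize(frames):
    """Warp every BGR frame into the coordinate frame of frames[0]."""
    prev_gray = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    acc = np.eye(3)                 # accumulated motion: current -> first frame
    h, w = prev_gray.shape
    out = [frames[0]]
    for frame in frames[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Track corner features from the previous frame into this one.
        p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                     qualityLevel=0.01, minDistance=20)
        p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None)
        good0 = p0[status.flatten() == 1]
        good1 = p1[status.flatten() == 1]
        # Rigid model (rotation + uniform scale + translation),
        # mapping current-frame points onto previous-frame points.
        m, _ = cv2.estimateAffinePartial2D(good1, good0)
        if m is None:               # tracking failed: assume no motion
            m = np.eye(2, 3)
        acc = acc @ np.vstack([m, [0, 0, 1]])   # chain into first-frame coords
        out.append(cv2.warpAffine(frame, acc[:2], (w, h)))
        prev_gray = gray
    return out
```

A practical stabilizer would instead smooth the accumulated trajectory (e.g., with a low-pass or Kalman filter) so deliberate vehicle motion is preserved and only the high-frequency flapping shake is removed.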
Table 1. Comparison of visual perception systems

Platform | Developer | System type | Camera configuration | Image stabilization | Tracking
DelFly II | Delft University of Technology | Onboard | Stereo pair | Yes | No
X-wing flapping-wing vehicle | Seoul National University | Onboard + off-board | Monocular + ground Vicon system | No | Yes
Dove | Northwestern Polytechnical University | Onboard | Monocular | Yes | No
BionicFlyingFox | Festo | Off-board | Two infrared cameras | No | No
H2Bird | UC Berkeley | Off-board | Monocular | No | No
Insect-inspired vehicle | Harvard University | Off-board | 10 Vicon cameras | No | No
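Only the Seoul National University platform in Table 1 reports target tracking. For orientation only, the snippet below sketches a minimal single-object tracking loop built on OpenCV's off-the-shelf KCF tracker (requires the opencv-contrib-python package); the video file name is a placeholder, and none of the vehicles above is claimed to use this particular tracker.

```python
import cv2

cap = cv2.VideoCapture("onboard_clip.mp4")    # placeholder file name
ok, frame = cap.read()
bbox = cv2.selectROI("select target", frame)  # operator marks the target once
tracker = cv2.TrackerKCF_create()             # kernelized correlation filter
tracker.init(frame, bbox)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, bbox = tracker.update(frame)       # re-locate target in new frame
    if found:
        x, y, w, h = (int(v) for v in bbox)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) == 27:                  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```

Correlation-filter trackers such as KCF run at hundreds of frames per second on a CPU, which is one reason lightweight trackers are attractive for the weight- and power-constrained platforms compared above.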