<span id="fpn9h"><noframes id="fpn9h"><span id="fpn9h"></span>
<span id="fpn9h"><noframes id="fpn9h">
<th id="fpn9h"></th>
<strike id="fpn9h"><noframes id="fpn9h"><strike id="fpn9h"></strike>
<th id="fpn9h"><noframes id="fpn9h">
<span id="fpn9h"><video id="fpn9h"></video></span>
<ruby id="fpn9h"></ruby>
<strike id="fpn9h"><noframes id="fpn9h"><span id="fpn9h"></span>
  • Indexed in the Engineering Index (EI)
  • Chinese Core Journal
  • Source journal for Chinese Science and Technology Paper Statistics
  • Source journal of the Chinese Science Citation Database


Research progress on visual perception system of bionic flapping-wing aerial vehicles

FU Qiang, CHEN Xiang-yang, ZHENG Zi-liang, LI Qing, HE Wei

Citation: FU Qiang, CHEN Xiang-yang, ZHENG Zi-liang, LI Qing, HE Wei. Research progress on visual perception system of bionic flapping-wing aerial vehicles[J]. Chinese Journal of Engineering, 2019, 41(12): 1512-1519. doi: 10.13374/j.issn2095-9389.2019.03.08.001

Research progress on visual perception system of bionic flapping-wing aerial vehicles

doi: 10.13374/j.issn2095-9389.2019.03.08.001
Funds: National Key Research and Development Program of China (2017YFB1300102); Young Scientists Fund of the National Natural Science Foundation of China (61803025)
Detailed information
    Corresponding author:

    E-mail: weihe@ieee.org

  • CLC number: TP242.6

  • Abstract: Bionic flapping-wing aerial vehicles (FWAVs) are aircraft that imitate birds and insects by generating lift and thrust through the active motion of their wings. Owing to advantages such as high flight efficiency, strong maneuverability, and good concealment, flapping-wing aerial vehicles have attracted increasing attention and research in recent years. Thanks to their compact structure and maneuverability, small flapping-wing aerial vehicles can adapt to more complex environments, but these same characteristics limit their payload capacity and battery endurance. In many scenarios, heavy and power-hungry sensors are therefore no longer suitable for flapping-wing aerial vehicles. Most of the information that living creatures obtain from their surroundings is acquired through vision. As an effective way of acquiring information, vision plays an irreplaceable role in flapping-wing aerial vehicle applications. Visual sensors are lightweight, consume little power, and provide rich image information, which makes them well suited for mounting on flapping-wing aerial vehicles. With the continuous development of microelectronics, image processing, and related technologies, visual perception systems built on flapping-wing platforms have also made important progress. This paper first introduces the visual perception systems of several representative flapping-wing aerial vehicles at home and abroad, divided into two categories: onboard visual perception systems and external (off-board) visual perception systems. It then briefly reviews the state of development of three key technologies of such systems, namely image de-jittering (stabilization), target detection and recognition, and target tracking, and concludes that research on visual perception systems for flapping-wing aerial vehicles is still at an early stage. Finally, it points out that image de-jittering, onboard real-time processing, target detection and recognition, and three-dimensional reconstruction are promising future research directions for the visual perception systems of flapping-wing aerial vehicles.
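To make the image de-jittering step concrete, the sketch below shows a generic feature-based digital stabilization loop in Python with OpenCV: track sparse corner features between consecutive frames, estimate the inter-frame rigid motion, smooth the accumulated camera trajectory with a moving average, and warp each frame by the residual correction. This is a minimal illustrative sketch under assumed parameters (the file names, the 200 tracked corners, and the 15-frame smoothing radius are all placeholders), not the method of any specific system surveyed in the paper.

```python
# Illustrative sketch only: a generic feature-based digital stabilization loop,
# not the de-jitter implementation of any FWAV system reviewed in this paper.
# File names, corner/window parameters, and the moving-average smoother are
# assumptions chosen for the example.
import cv2
import numpy as np


def estimate_frame_motion(prev_gray, gray):
    """Estimate the rigid motion (dx, dy, d_angle) between consecutive frames."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=30)
    if pts is None:
        return 0.0, 0.0, 0.0
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good_prev = pts[status.flatten() == 1]
    good_next = nxt[status.flatten() == 1]
    m, _ = cv2.estimateAffinePartial2D(good_prev, good_next)
    if m is None:
        return 0.0, 0.0, 0.0
    return m[0, 2], m[1, 2], np.arctan2(m[1, 0], m[0, 0])


def stabilize(in_path="shaky.mp4", out_path="stable.mp4", radius=15):
    # Pass 1: estimate the inter-frame motion over the whole clip.
    cap = cv2.VideoCapture(in_path)
    ok, prev = cap.read()
    if not ok:
        raise IOError("cannot read input video")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    motions = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        motions.append(estimate_frame_motion(prev_gray, gray))
        prev_gray = gray
    cap.release()

    # Smooth the accumulated camera trajectory with a moving average; the
    # difference between the smoothed and raw trajectories is the correction
    # that removes high-frequency jitter while keeping intentional motion.
    motions = np.array(motions)
    trajectory = np.cumsum(motions, axis=0)
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    smoothed = np.column_stack([np.convolve(trajectory[:, i], kernel, mode="same")
                                for i in range(3)])
    corrected = motions + (smoothed - trajectory)

    # Pass 2: warp each frame by its corrected rigid transform and write it out.
    cap = cv2.VideoCapture(in_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    for dx, dy, da in corrected:
        ok, frame = cap.read()
        if not ok:
            break
        m = np.array([[np.cos(da), -np.sin(da), dx],
                      [np.sin(da),  np.cos(da), dy]], dtype=np.float32)
        writer.write(cv2.warpAffine(frame, m, (w, h)))
    cap.release()
    writer.release()


if __name__ == "__main__":
    stabilize()
```

A moving-average smoother is used here purely for brevity; in practice the amount of smoothing trades stabilization strength against responsiveness to intentional camera motion.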

     

  • Figure 1.  DelFly II flapping-wing aerial vehicle and stereo visual perception system

    Figure 2.  Experimental platform of modified X-wing FWAV: (a) original model; (b) modified model

    Figure 3.  Integration architecture of the X-wing FWAV platform with ground station

    Figure 4.  Structure of Dove

    Figure 5.  BionicFlyingFox and off-board infrared camera system

    Figure 6.  H2Bird ornithopter MAV

    Figure 7.  Cooperative control system of H2Bird

    Figure 8.  Insect-scale flapping-wing aerial vehicle by Harvard

    Figure 9.  Vicon motion capture system of insect-scale flapping-wing aerial vehicle

    Figure 10.  Miniature onboard pan–tilt

    Figure 11.  Image captured by the onboard camera of Dove

    Table 1.   Comparison of visual perception systems

    System platform   | Developer                             | Type               | Cameras                         | De-jitter | Tracking
    DelFly II         | Delft                                 | Onboard            | Stereo pair                     |           |
    X-wing FWAV       | Seoul National University             | Onboard + external | Monocular + ground Vicon system |           |
    Dove              | Northwestern Polytechnical University | Onboard            | Monocular                       |           |
    BionicFlyingFox   | Festo                                 | External           | Two infrared cameras            |           |
    H2Bird            | UC Berkeley                           | External           | Monocular                       |           |
    Insect-scale FWAV | Harvard University                    | External           | 10 Vicon cameras                |           |
    <span id="fpn9h"><noframes id="fpn9h"><span id="fpn9h"></span>
    <span id="fpn9h"><noframes id="fpn9h">
    <th id="fpn9h"></th>
    <strike id="fpn9h"><noframes id="fpn9h"><strike id="fpn9h"></strike>
    <th id="fpn9h"><noframes id="fpn9h">
    <span id="fpn9h"><video id="fpn9h"></video></span>
    <ruby id="fpn9h"></ruby>
    <strike id="fpn9h"><noframes id="fpn9h"><span id="fpn9h"></span>
    www.77susu.com
Figures(11) / Tables(1)
Metrics
  • Article views:  2449
  • Full-text HTML views:  2963
  • PDF downloads:  197
  • Cited by: 0
Publication history
  • Received:  2019-03-08
  • Published:  2019-12-01
