
Author: 黃大峯 (Huang, Da-Feng)
Title: Real Time Onboard Camera-Based Navigation using Multi-Rotor (應用於多旋翼機即時影像視覺導航)
Advisor: 林清一 (Lin, Chin E.)
Degree: Master
Department: College of Engineering, Institute of Civil Aviation
Year of publication: 2014
Graduation academic year: 102 (2013-2014)
Language: English
Pages: 56
Keywords: Visual Navigation, Real Time Localization and Mapping, Quad-rotor UAV, Embedded Controller
  • In recent years, falling prices and the wide availability of flight control boards have made unmanned aerial vehicles popular everywhere, but enabling a small unmanned vehicle to fly regardless of environmental constraints remains a major challenge. After a severe disaster, when existing maps and many radio facilities are unavailable, how can a navigation system be designed that lets an unmanned vehicle fly autonomously? This thesis addresses the problem of Simultaneous Localization and Mapping (SLAM). In response to the shortcomings of existing navigation systems and the need for autonomous localization, this study designs a real-time navigation system based on computer vision and data fusion. In the computer-vision part, a novel visual odometry algorithm is introduced that copes with high-frame-rate image tracking and estimates the current translation and attitude of the vehicle. Furthermore, because a single camera cannot recover true metric scale, a modular multi-sensor-fusion extended Kalman filter (MSF-EKF) is introduced to recover real-world scale and to calibrate the rotations between the different coordinate frames on the vehicle. In addition, a quad-rotor carrying an embedded computer is used to verify this navigation system in practice. Test results show that, once the conditions required by the computer-vision module are satisfied, the required navigation information can be computed stably and accurately.
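The geometric heart of the visual odometry described above is the epipolar constraint between two camera views, and it also explains why a single camera cannot observe metric scale. The following is a minimal sketch, not code from the thesis: the rotation, baseline, and 3-D point are all hypothetical values, and normalized image coordinates are assumed.

```python
import numpy as np

def skew(t):
    """Cross-product (skew-symmetric) matrix [t]_x."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

# Hypothetical relative motion between two camera frames: a 10-degree
# yaw and a small baseline (illustrative values only).
c, s = np.cos(np.deg2rad(10.0)), np.sin(np.deg2rad(10.0))
R = np.array([[c, -s, 0.0],
              [s,  c, 0.0],
              [0.0, 0.0, 1.0]])
t = np.array([1.0, 0.2, 0.0])

E = skew(t) @ R                   # essential matrix E = [t]_x R

# Project one 3-D point into both views (normalized coordinates).
P1 = np.array([0.5, -0.3, 4.0])   # point in camera-1 coordinates
P2 = R @ P1 + t                   # same point in camera-2 coordinates
x1, x2 = P1 / P1[2], P2 / P2[2]

# Epipolar constraint: x2^T E x1 = 0 for a true correspondence.
print(abs(x2 @ E @ x1))           # ~0 up to numerical noise

# The constraint is homogeneous in t, so image correspondences alone
# cannot determine the baseline length -- the monocular scale ambiguity.
print(abs(x2 @ (skew(2.0 * t) @ R) @ x1))
```

Because doubling the baseline leaves every correspondence consistent, the filter described next is needed to pin down the scale from other sensors.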

    The micro aerial vehicle (MAV) has rapidly developed from a hobbyist platform into a useful tool in many applications, so autonomous flight navigation in varied environments is necessary. After a devastating disaster, terrain features may have changed, and it is not easy to design a navigation system that guides an MAV autonomously, especially when the area has no available map and no usable radio beacons. This work focuses on building a map and localizing within it at the same time, known as Simultaneous Localization and Mapping (SLAM). Because of the drawbacks of existing GPS, a real-time navigation system is designed in this thesis based on visual SLAM and sensor fusion. In the visual SLAM module, an innovative visual odometry algorithm is introduced to cope with high-frame-rate images and texture-less environments, and to estimate the translation and rotation of the MAV. Since a monocular camera alone cannot recover true metric scale, the proposed visual SLAM is combined with a modular multi-sensor-fusion extended Kalman filter (MSF-EKF) to recover real-world scale and calibrate the sensor frames in real time. Moreover, a quad-rotor UAV carrying an embedded computer is used to verify the proposed navigation mechanism. The test results show that, in environments satisfying the visual SLAM requirements, the proposed system computes the required navigation information stably and accurately.
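The scale-recovery role of the sensor-fusion filter can be illustrated with a deliberately reduced toy model, not the MSF-EKF itself: a one-dimensional Kalman filter that estimates the unknown monocular scale factor by fusing the up-to-scale altitude from visual SLAM with a metric altitude sensor. The sensor choice, noise levels, and true scale below are all hypothetical.

```python
import random

random.seed(7)

true_scale = 2.0      # metres per visual unit (unknown to the filter)
lam, P = 1.0, 1.0     # prior estimate of the scale and its variance
R_noise = 0.05 ** 2   # assumed variance of the metric altitude sensor

for _ in range(200):
    h_vis = random.uniform(0.5, 2.0)                  # up-to-scale altitude from visual SLAM
    z = true_scale * h_vis + random.gauss(0.0, 0.05)  # metric altitude (e.g. sonar)
    H = h_vis                                         # measurement model: z = H * lam + v
    K = P * H / (H * P * H + R_noise)                 # Kalman gain
    lam += K * (z - H * lam)                          # state update
    P *= (1.0 - K * H)                                # covariance update

print(round(lam, 3))  # converges toward true_scale as measurements accumulate
```

The full MSF-EKF additionally estimates attitude, sensor biases, and the inter-sensor frame rotations, but the measurement-update pattern is the same.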

    Table of Contents
    Abstract (Chinese) I
    ABSTRACT II
    Acknowledgements III
    List of Figures VI
    Chapter 1 Introduction 1
        1.1 Motivation 1
        1.2 Problem Background 4
        1.3 Thesis Outline 4
    Chapter 2 Visual Navigation 6
        2.1 Introduction to Visual SLAM 6
        2.2 Multiple View Geometry 9
        2.3 Image Analysis 17
        2.4 Visual SLAM 19
        2.5 Summary 21
    Chapter 3 Data Filtering 22
        3.1 Kalman Filter 22
        3.2 Multi-Sensor-Fusion Extended Kalman Filter 25
    Chapter 4 Camera-Based Multi-Rotor Navigation 29
        4.1 Multi-Rotor System 29
        4.2 Experiment System Setup 31
        4.3 Visual Odometry Test 36
        4.4 Outdoor Experiments 42
        4.5 Summary 51
    Chapter 5 Conclusion 53
        5.1 Conclusion 53
        5.2 Future Work 54
    References 55

    [1] P. Pounds, R. Mahony, and P. Corke, “Modelling and Control of a Quad-rotor Robot,” Proceedings Australasian Conference on Robotics and Automation, 2006.
    [2] MAVLink Micro Air Vehicle Communication Protocol, available in June 2014 from website: http://qgroundcontrol.org/mavlink/start.
    [3] M. Achtelik, A. Bachrach, R. He, S. Prentice, and N. Roy, “Autonomous Navigation and Exploration of a Quadrotor Helicopter in GPS-denied Indoor Environments,” First Symposium on Indoor Flight, 2009.
    [4] G. Klein and D. Murray, “Parallel Tracking and Mapping on a Camera Phone,” Proceedings of the Eighth IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR’09), Orlando, 2009.
    [5] S. Weiss, M. W. Achtelik, S. Lynen, M. C. Achtelik, L. Kneip, M. Chli, and R. Siegwart, “Monocular Vision for Long-term Micro Aerial Vehicle State Estimation: A Compendium,” Journal of Field Robotics, vol. 30, no. 5, pp. 803–831, Sep. 2013.
    [6] R. A. Newcombe, S. J. Lovegrove, and A. J. Davison, “DTAM: Dense Tracking and Mapping in Real-time,” Computer Vision (ICCV), 2011 IEEE International Conference on, 2011, pp. 2320–2327.
    [7] Y. Ma, S. Soatto, J. Košecká, and S. S. Sastry, An Invitation to 3-D Vision: From Images to Geometric Models, Interdisciplinary Applied Mathematics, vol. 26, New York, NY: Springer, 2004.
    [8] F. Devernay and O. D. Faugeras, “Automatic calibration and removal of distortion from scenes of structured environments,” SPIE’s 1995 International Symposium on Optical Science, Engineering, and Instrumentation, 1995, pp. 62–72.
    [9] J. J. Engel, “Autonomous Camera-Based Navigation of a Quadrocopter,” Master's thesis, Technical University Munich, 2011.
    [10] S. M. Weiss, “Vision Based Navigation for Micro Helicopters,” Ph.D. Dissertation, Eidgenössische Technische Hochschule ETH Zürich, Diss. ETH No. 20305, 2012.
    [11] C. Forster, M. Pizzoli, and D. Scaramuzza, “SVO: Fast Semi-Direct Monocular Visual Odometry,” Proc. IEEE Intl. Conf. on Robotics and Automation, 2014.
    [12] Epipolar constraint, available in June 2014 from website: http://en.wikipedia.org/wiki/Epipolar_geometry#mediaviewer/File:Epipolar_Geometry1.svg
    [13] S. Lynen, M. W. Achtelik, S. Weiss, M. Chli, and R. Siegwart, “A Robust and Modular Multi-sensor Fusion Approach Applied to MAV Navigation,” Intelligent Robots and Systems (IROS), 2013 IEEE/RSJ International Conference on, 2013, pp. 3923–3929.
    [14] N. Trawny and S. I. Roumeliotis, “Indirect Kalman Filter for 3D Attitude Estimation,” University of Minnesota, Dept. of Comp. Sci. & Eng., Tech. Rep. 2005-002, March 2005.
    [15] Introductory tutorial for using ethzasl_sensor_fusion, available in June 2014 from website: http://wiki.ros.org/ethzasl_sensor_fusion/Tutorials/getting_started.

    Full-text availability: on campus: immediately public; off campus: public from 2015-09-01