
Graduate Student: 麥耘菁 (Mai, Yu-Ching)
Thesis Title: 使用單視覺影像定位與建圖方法結合慣性量測元件提升室內定位與導航之研究
Integration of Indoor Position and Navigation using Monocular SLAM and IMU
Advisor: 詹劭勳 (Jan, Shau-Shiun)
Degree: Master
Department: College of Engineering - Department of Aeronautics and Astronautics
Year of Publication: 2014
Graduating Academic Year: 103
Language: English
Number of Pages: 60
Keywords: Kalman Filter, Monocular Simultaneous Localization and Mapping, Image Processing, Inertial Measurement Unit
Abstract:
The Global Positioning System (GPS) has been developed for many years and is widely used for outdoor positioning and navigation. However, satellite signals are blocked by high-rise buildings and other structures and cannot be used reliably indoors, which has motivated the development of many indoor positioning methods. Indoor environments are complex, and beyond the common sonar and laser ranging approaches, this thesis focuses on a visual simultaneous localization and mapping (SLAM) algorithm, assisted by an inertial measurement unit (IMU), to build an indoor positioning system. As the vehicle moves, the monocular SLAM system measures the depth between the image feature points and the camera and, with an Extended Kalman Filter, provides real-time position, velocity, and attitude in the local frame; the IMU provides real-time acceleration and attitude of the camera to reduce the error that arises when the monocular camera observes too few feature points. Finally, this thesis applies a multi-sensor fusion algorithm, a multi-rate Kalman filter, to integrate the information from the two sensors into a single output. In the experiments, feature detection and matching are implemented in BCB, a Visual Studio C# program with the corresponding libraries collects the real-time MonoSLAM and IMU measurements, and Matlab is used for data analysis and verification.
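As a rough, self-contained illustration of the feature-depth measurement described above (not code from the thesis), the Python sketch below projects a 3D feature point into pixel coordinates with an ideal pinhole model; the focal length and principal point values are placeholder assumptions, not the thesis's calibrated camera parameters.

```python
import numpy as np

# Placeholder intrinsics (focal length and principal point, in pixels);
# the thesis calibrates its own camera, so these values are illustrative only.
fx, fy = 500.0, 500.0
cx, cy = 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

def project(point_cam):
    """Project a 3D feature point (camera frame, metres) to pixel coordinates.

    The z component is the depth between the feature and the camera; the ideal
    pinhole model ignores lens distortion.
    """
    x, y, z = point_cam
    pixel_h = K @ np.array([x / z, y / z, 1.0])  # homogeneous pixel coordinates
    return pixel_h[:2], z

pixel, depth = project(np.array([0.2, -0.1, 2.0]))
print(pixel, depth)  # [370. 215.] 2.0
```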

Abstract (English):
The Global Positioning System (GPS) has been developed over many years and is widely used for outdoor positioning and navigation. However, high-rise buildings and indoor environments block the satellite signals, so many indoor positioning methods have been developed to address this issue. In addition to ranging methods such as sonar and laser, this research uses monocular simultaneous localization and mapping (MonoSLAM) combined with an inertial measurement unit (IMU) to build an indoor positioning system. As the vehicle moves, MonoSLAM measures the depth between the image features and the camera and, using an Extended Kalman Filter (EKF), provides real-time position, velocity, and camera attitude. Because feature points do not always appear and cannot be trusted at all times, a wrong estimate can cause the position solution to diverge. The integrated system in this thesis therefore uses a multi-rate Kalman filter so that the two sensors complement each other. Finally, in the experiments, Visual Studio C# is used to collect the real-time MonoSLAM and IMU data, and Matlab is used to verify the results.
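The multi-rate fusion idea can be sketched as follows. This is an illustrative Python example rather than the thesis's C#/Matlab implementation: a position/velocity state is propagated at an assumed 100 Hz IMU rate using measured acceleration, and corrected whenever a MonoSLAM position fix arrives at an assumed 10 Hz camera rate. The 1-D state, noise values, and simulated inputs are placeholders, not the thesis's actual model or tuning.

```python
import numpy as np

dt_imu = 0.01        # assumed 100 Hz IMU rate
cam_every = 10       # assumed 10 Hz MonoSLAM position fixes

# State x = [position, velocity]; measured acceleration enters as a control input.
x = np.zeros(2)
P = np.eye(2)
Q = np.diag([1e-4, 1e-3])   # illustrative process noise
R = np.array([[1e-2]])      # illustrative MonoSLAM position noise
F = np.array([[1.0, dt_imu], [0.0, 1.0]])
B = np.array([0.5 * dt_imu**2, dt_imu])
H = np.array([[1.0, 0.0]])

def predict(x, P, accel):
    """High-rate step: integrate IMU acceleration into position and velocity."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, pos_meas):
    """Low-rate step: correct the state with a MonoSLAM position measurement."""
    y = pos_meas - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

for k in range(100):                      # one second of simulated data
    x, P = predict(x, P, accel=0.1)       # constant acceleration, for illustration
    if k % cam_every == cam_every - 1:    # a camera fix every 10 IMU samples
        true_pos = 0.5 * 0.1 * ((k + 1) * dt_imu) ** 2
        x, P = update(x, P, pos_meas=np.array([true_pos]))
```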

Table of Contents:
CHAPTER I   Introduction
  1.1 Introduction to Indoor Positioning
  1.2 Literature Review
  1.3 Motivation and Objectives
  1.4 Thesis Outline
CHAPTER II  Image Processing
  2.1 Basics of Image Processing
    2.1.1 Feature Detection (FAST)
    2.1.2 Feature Correlation
    2.1.3 Ideal Pinhole Model
  2.2 Camera Calibration
  2.3 Monocular Simultaneous Localization and Mapping
    2.3.1 MonoSLAM and State Model
    2.3.2 Extended Kalman Filter
    2.3.3 Inverse Depth Parameterization
  2.4 Data Association
    2.4.1 Feature Initialization
    2.4.2 Map Management
  2.5 Interim Summary
CHAPTER III Inertial Measurement Unit
  3.1 IMU Calibration
  3.2 Multi-rate Kalman Filter
  3.3 Interim Summary
CHAPTER IV  Experiment and Results
  4.1 Hardware and Software
    4.1.1 Camera
    4.1.2 Inertial Measurement Unit (IMU)
    4.1.3 Wireless Transmission
    4.1.4 Hardware Overview
    4.1.5 Software
  4.2 MonoSLAM Standalone Test
  4.3 Inertial Measurement Unit Standalone Test
  4.4 Integration of MonoSLAM and IMU Test
  4.5 Error Analysis
  4.6 Interim Summary
CHAPTER V   Conclusion
  5.1 Concluding Remarks
  5.2 Future Prospects
REFERENCES

