| Student: | Sun, Yi-Chieh (孫意捷) |
|---|---|
| Thesis title: | Visual-Inertial SLAM by Fusing Stereo and Inertial Measurement Units Based on ORB-SLAM |
| Advisor: | Jan, Shau-Shiun (詹劭勳) |
| Degree: | Master |
| Department: | Department of Aeronautics & Astronautics, College of Engineering |
| Year of publication: | 2018 |
| Academic year: | 106 |
| Language: | English |
| Pages: | 88 |
| Keywords: | visual-inertial, stereo camera, sensor fusion, simultaneous localization and mapping, ORB-SLAM |
Simultaneous localization and mapping (SLAM) is a popular research topic that has been widely implemented in different products. Visual-inertial SLAM fuses an IMU with a camera: the IMU provides motion information between two frames, while the camera provides image measurements that allow the IMU biases to be corrected. The two sensors thus compensate for each other's weaknesses and yield a more robust and accurate system.
Inspired by the work that fuses monocular ORB-SLAM with an IMU, we follow its approach of tightly-coupled, keyframe-based optimization to build a visual-inertial SLAM system. In addition, we use a stereo camera so that the scale information becomes directly observable. The IMU measurements between two consecutive keyframes are pre-integrated into a single compound measurement that represents the motion between them, and reliable IMU initialization parameters are estimated automatically during initialization. Because the stereo camera provides scale, the equations that solve for scale and the gravity vector can be simplified, which speeds up the computation. We evaluate the system on a public micro-aerial-vehicle dataset (EuRoC), which contains sequences of several difficulty levels. The proposed system completes initialization within 5 to 10 seconds of start-up, and on the sequences that are tracked successfully it reaches centimeter-level accuracy.
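The core idea mentioned in the abstract, compounding all IMU samples between two keyframes into a single relative-motion measurement, can be sketched as follows. This is only a minimal illustration of IMU preintegration (biases, noise propagation, and bias-correction Jacobians are omitted); the function names are ours for illustration, not from the thesis.

```python
import numpy as np

def skew(w):
    # Skew-symmetric (hat) matrix of a 3-vector.
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def so3_exp(phi):
    # Rodrigues' formula: exponential map from so(3) to a rotation matrix.
    theta = np.linalg.norm(phi)
    if theta < 1e-8:
        return np.eye(3) + skew(phi)
    a = phi / theta
    A = skew(a)
    return np.eye(3) + np.sin(theta) * A + (1.0 - np.cos(theta)) * (A @ A)

def preintegrate(gyro, accel, dt):
    """Compound IMU samples between two keyframes into (dR, dv, dp).

    gyro, accel: (N, 3) arrays of body-frame angular rate [rad/s] and
    specific force [m/s^2], with biases and gravity assumed already
    removed for this sketch; dt: sample period [s].
    """
    dR = np.eye(3)        # relative rotation keyframe_i -> keyframe_j
    dv = np.zeros(3)      # relative velocity change, frame of keyframe_i
    dp = np.zeros(3)      # relative position change, frame of keyframe_i
    for w, a in zip(gyro, accel):
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt ** 2
        dv = dv + (dR @ a) * dt
        dR = dR @ so3_exp(w * dt)
    return dR, dv, dp
```

The returned triple serves as a single measurement connecting two keyframes in the optimization, so the back-end does not have to re-handle hundreds of raw IMU samples at every iteration.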
Campus access: public from 2023-08-01.