
Graduate Student: Galimant, Raphaelle
Thesis Title: Study of a Robust Stereo Visual-Inertial SLAM based on Point and Line Features
Advisor: Lai, Ying-Chih
Degree: Master
Department: College of Engineering - Department of Aeronautics and Astronautics
Year of Publication: 2023
Graduation Academic Year: 111
Language: English
Number of Pages: 76
Keywords (Chinese): 視覺慣性里程計、點和線特徵、同步定位與地圖建置、視覺定位系統
Keywords (English): Visual-Inertial Odometry, Point and Line Features, Simultaneous Localization and Mapping, Visual-Inertial Navigation Systems
    With the development of autonomous aerial systems, lightweight, reliable, and accurate unmanned aerial vehicles (UAVs) are of great importance. The minimum sensor suite for such a drone would consist of a camera and an inertial measurement unit (IMU). With the measurements provided by these sensors and a robust visual-inertial odometry (VIO) algorithm, the UAV can compute its own state in real time, and this information is essential for autonomous flight. The development of VIO algorithms is an important direction in autonomous-systems research. However, VIO algorithms still face some problems: in challenging environments they may lose accuracy. Most VIO algorithms rely on point features to estimate the position and orientation of the system, and the information provided by point features is affected by the surrounding environment. In challenging settings such as feature-poor and low-light environments, the accuracy of a VIO algorithm degrades. In this study, we developed a robust VIO algorithm for the autonomous flight of a drone. A visual-inertial navigation system based on point features was improved to take a new feature type, the line feature, into account. We studied and developed a method to fuse line features into an existing VIO algorithm and implemented it in C++. The performance of the proposed system was evaluated through ROS simulations on an open-source dataset. Experimental results show that the proposed method effectively improves the accuracy of the original VIO algorithm.

    With the development of autonomous aeronautical systems, there is a strong demand for lightweight, reliable, and accurate Unmanned Aerial Vehicles (UAVs). The minimum set of sensors for such a drone would be composed of cameras and an inertial measurement unit. With the measurements provided by these sensors and a robust visual-inertial odometry (VIO) algorithm, the UAV would be able to estimate its own state in real time. Such information is crucial for autonomous flight, and the development of VIO algorithms is a major focus of research in autonomous systems. However, VIO algorithms still face issues: they can lose accuracy in challenging environments. Most VIO algorithms rely on point features to estimate the position and orientation of the system, and the information provided by point features depends on the surrounding environment. In challenging environments such as low-textured and low-light scenes, the accuracy of a VIO algorithm deteriorates. In this study, we developed a robust VIO algorithm for the autonomous flight of a drone. A visual-inertial navigation system based on point features was extended to take into consideration a new type of feature, the line feature. The fusion of line features into an existing VIO algorithm was studied and implemented in C++. ROS simulations on a public dataset were used to evaluate the performance of the proposed system. The experimental results show that the proposed method effectively improves the accuracy of the original VIO algorithm.
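    The abstract describes fusing line features into a point-based VIO pipeline, and the table of contents indicates that the thesis uses a geometric representation of lines (Plücker coordinates are named in its references). As an illustration only, not the thesis's actual code, the following minimal C++ sketch shows the Plücker parameterization of a 3D line: a direction vector d and a moment vector n = p × d for any point p on the line, subject to the constraint n · d = 0.

    ```cpp
    #include <array>
    #include <cassert>
    #include <cmath>
    #include <cstdio>

    using Vec3 = std::array<double, 3>;

    Vec3 cross(const Vec3& a, const Vec3& b) {
        return {a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0]};
    }
    double dot(const Vec3& a, const Vec3& b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }
    double norm(const Vec3& a) { return std::sqrt(dot(a, a)); }

    // A 3D line in Plücker coordinates: direction d and moment n = p x d,
    // where p is any point on the line. A valid line satisfies dot(n, d) == 0.
    struct PluckerLine {
        Vec3 n;  // moment vector
        Vec3 d;  // direction vector

        static PluckerLine fromTwoPoints(const Vec3& p1, const Vec3& p2) {
            Vec3 d = {p2[0] - p1[0], p2[1] - p1[1], p2[2] - p1[2]};
            Vec3 n = cross(p1, d);  // equivalently cross(p1, p2)
            return {n, d};
        }

        // Perpendicular distance from the origin to the line: |n| / |d|.
        double distanceFromOrigin() const { return norm(n) / norm(d); }
    };

    int main() {
        // Line through (1,0,0) and (1,0,1): parallel to the z-axis,
        // one unit away from the origin.
        PluckerLine L = PluckerLine::fromTwoPoints({1, 0, 0}, {1, 0, 1});
        assert(std::fabs(dot(L.n, L.d)) < 1e-12);              // Plücker constraint
        assert(std::fabs(L.distanceFromOrigin() - 1.0) < 1e-12);
        std::printf("Plücker constraint and distance checks passed\n");
        return 0;
    }
    ```

    The six numbers (n, d) over-parameterize a line, which has only four degrees of freedom; optimization-based back-ends therefore commonly switch to a minimal four-parameter (orthonormal) representation during bundle adjustment, as in Bartoli and Sturm's structure-from-motion formulation cited by the thesis.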

    Abstract I
    Chinese Abstract II
    Acknowledgements III
    Contents IV
    List of Tables VI
    List of Figures VII
    List of Symbols IX
    1 INTRODUCTION 1
    1.1 Research Background 1
    1.2 Literature Review 3
    1.3 Motivation and Objectives 9
    2 VINS-FUSION AND LINE FEATURES PROCESSING 10
    2.1 VINS-Fusion: A Stereo Visual-Inertial Odometry Algorithm 10
    2.2 Line Features Processing 21
    2.2.1 Line Features Detection 21
    2.2.2 Line Features Matching 23
    2.2.3 Geometric Representation of the Lines 23
    3 METHODOLOGY 27
    3.1 VINS-Stereo-PL: System Overview 27
    3.2 Tightly Coupled Stereo VINS 28
    3.2.1 Sliding Window Formulation 28
    3.2.2 Line Feature Measurement Model 30
    3.3 Development of the VI-SLAM System 34
    3.3.1 Environment Setup 34
    3.3.2 Implementation of the Line Extraction Process 38
    3.3.3 Implementation of the Stereo Lines' Triangulation 39
    3.3.4 Implementation of the Optimization with Line Features 40
    4 SIMULATIONS AND RESULTS 45
    4.1 EuRoC MAV Dataset 45
    4.2 Simulation Environment on RViz 48
    4.3 Dataset Comparison 49
    4.4 Comparison of Accuracy Results with Similar Research 63
    4.5 Real-Time Analysis 63
    Conclusion and Future Work 69
    References 71
    Appendix 73


    Full-text access: on campus, available from 2028-06-14; off campus, available from 2028-06-14.
    The electronic thesis has not yet been authorized for public release; for the print copy, please consult the library catalog.