
Graduate Student: Chiu, Yi-Cheng (邱奕澂)
Thesis Title: GNSS/IMU Loosely Coupled Localization Aided with Fisheye Camera for Autonomous Vehicles (使用GNSS結合慣性量測單元並以魚眼相機輔助之自駕車鬆耦合定位技術)
Advisor: Juang, Jyh-Ching (莊智清)
Degree: Master
Department: Department of Electrical Engineering, College of Electrical Engineering and Computer Science
Year of Publication: 2023
Graduation Academic Year: 111
Language: English
Number of Pages: 86
Chinese Keywords: 自動駕駛 (autonomous driving), 導航定位 (positioning and navigation), 感測器融合 (sensor fusion), 非視距 (non-line-of-sight)
Keywords: Autonomous Vehicle, Positioning and Navigation, Sensor Fusion, NLOS
Usage: 120 views; 19 downloads
    Vehicle localization is indispensable information in an autonomous driving system; only with an accurate vehicle position can an autonomous vehicle move steadily toward its destination. Current vehicle localization techniques mainly obtain the vehicle position from lidar or camera sensing. However, when the surroundings offer too few feature points or the scene changes drastically, these methods are prone to error. This thesis therefore proposes a multi-sensor fusion localization method that provides the autonomous driving system with reliable vehicle position, velocity, and attitude information while remaining robust to changes in the environment and weather.
    The Global Navigation Satellite System (GNSS) provides meter-level positioning accuracy outdoors, but its update rate is low and its error grows when the signal is blocked. This thesis therefore integrates an inertial measurement unit (IMU) and a fisheye camera. The IMU supplies high-rate vehicle dynamics, filling the gaps between GNSS updates. The fisheye camera's 360° panoramic sky image helps exclude blocked satellite signals, reducing the impact of non-line-of-sight (NLOS) satellites on the positioning system. Finally, an extended Kalman filter (EKF) fuses the measurements from all sensors to estimate the current vehicle state.
    The multi-sensor fusion localization method used in this thesis is applicable to a variety of experimental scenarios, and the estimated vehicle pose helps improve the performance of downstream autonomous driving algorithms. The proposed localization technique has been implemented on a commercial autonomous golf cart and field-tested at the Sunrise Golf & Country Club.

    A well-integrated autonomous driving system (ADS) needs a reliable vehicle position: only with an accurate position estimate can the autonomous vehicle (AV) move steadily toward its destination. Existing vehicle positioning methods rely on sensors such as lidars or cameras. However, when few distinguishable features are available or the scene changes dramatically, these localization methods are prone to positioning errors. To remain robust against scene and weather changes, this thesis proposes a multi-sensor fusion method for vehicle positioning that provides reliable vehicle position, velocity, and attitude to the ADS.
    The Global Navigation Satellite System (GNSS) can provide a meter-level solution in open-sky environments. However, GNSS has a low update rate, and its accuracy degrades when the signal is blocked. Therefore, an inertial measurement unit (IMU) and a fisheye camera are integrated. The IMU provides high-rate vehicle dynamics that fill the gaps between GNSS updates. The fisheye camera provides a 360° image of the surrounding sky; the impact of non-line-of-sight (NLOS) satellite signals is removed by excluding satellites that do not project onto the visible sky region. Finally, all sensor information is fused in an extended Kalman filter (EKF) to obtain a more precise estimate of the vehicle state.
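    The NLOS exclusion step can be illustrated with a short sketch: project each satellite's azimuth/elevation onto the upward-facing fisheye image and keep only satellites that land on a sky pixel. The sketch below uses a simple equidistant fisheye model (r = f·θ) rather than the calibrated camera model used in the thesis; the function names and the binary sky mask are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np

def satellite_pixel(az_deg, el_deg, cx, cy, f):
    """Project a satellite's azimuth/elevation onto an upward-facing
    fisheye image using an equidistant model r = f * theta, where
    theta is the zenith angle (90 deg minus elevation)."""
    theta = np.radians(90.0 - el_deg)    # zenith angle in radians
    az = np.radians(az_deg)
    r = f * theta                        # radial distance from image center
    u = int(round(cx + r * np.sin(az)))  # image x: east of center
    v = int(round(cy - r * np.cos(az)))  # image y: north of center is up
    return u, v

def classify_los(sky_mask, az_deg, el_deg, cx, cy, f):
    """Return True (line-of-sight) if the satellite projects onto a
    sky pixel of the binary mask; blocked or out-of-image satellites
    are treated as NLOS and excluded from positioning."""
    u, v = satellite_pixel(az_deg, el_deg, cx, cy, f)
    h, w = sky_mask.shape
    if not (0 <= v < h and 0 <= u < w):
        return False
    return bool(sky_mask[v, u])
```

    In practice the binary sky mask would come from segmenting the fisheye image into sky and non-sky regions, and the satellite azimuth/elevation from the broadcast ephemeris.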
    This thesis proposes a multi-sensor fusion localization method that is applicable in various experiment scenarios. The estimated vehicle state can improve the performance of downstream autonomous driving algorithms. The system is implemented on a commercial autonomous golf cart and tested at the Sunrise Golf & Country Club.
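    The loosely coupled fusion can be sketched as a small EKF in which IMU accelerations drive the prediction and each GNSS position fix is a measurement. This is a simplified planar, constant-noise illustration; the state vector, noise parameters, and function names are assumptions for the sketch, not the thesis's full formulation.

```python
import numpy as np

def ekf_predict(x, P, accel, dt, q):
    """Propagate state [px, py, vx, vy] with an IMU acceleration input
    (planar constant-acceleration model between GNSS fixes)."""
    F = np.eye(4)
    F[0, 2] = F[1, 3] = dt
    B = np.array([[0.5 * dt**2, 0.0],
                  [0.0, 0.5 * dt**2],
                  [dt, 0.0],
                  [0.0, dt]])
    x = F @ x + B @ accel
    P = F @ P @ F.T + q * (B @ B.T)   # process noise shaped by the input matrix
    return x, P

def ekf_update(x, P, z_gnss, r):
    """Loosely coupled update: the GNSS position fix is the measurement."""
    H = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0, 0.0]])
    S = H @ P @ H.T + r * np.eye(2)       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ (z_gnss - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P
```

    In the loosely coupled architecture, only GNSS fixes that survive the NLOS satellite exclusion feed the update step, while the high-rate IMU keeps predicting between fixes.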

    摘要 (Chinese Abstract) I
    Abstract III
    Acknowledgments V
    Contents VI
    List of Tables IX
    List of Figures X
    List of Abbreviations XIII
    Chapter 1 Introduction 1
      1.1 Motivation and Objectives 1
      1.2 Literature Review 3
      1.3 Contributions 5
      1.4 Thesis Overview 6
    Chapter 2 System Overview and Data Analysis 8
      2.1 System Architecture 8
      2.2 Coordinate Systems and Transformation 9
        2.2.1 Coordinate Systems 10
        2.2.2 Coordinate Transformation 13
      2.3 Data Analysis 15
        2.3.1 GPS Ephemeris 15
        2.3.2 Galileo Ephemeris 20
    Chapter 3 Loosely Coupled Localization 23
      3.1 Sky and Non-Sky Detection 24
        3.1.1 Fisheye Camera Calibration 24
        3.1.2 Image Processing 26
      3.2 LOS/NLOS Exclusion 29
        3.2.1 Satellite Data Classification 30
        3.2.2 Satellite to Image Coordinate Transformation 32
        3.2.3 Satellite Visibility Classification 34
      3.3 GNSS Positioning via Least Squares Method 36
      3.4 INS Mechanization 39
      3.5 Sensor Fusion with Extended Kalman Filter 43
        3.5.1 System Model 45
        3.5.2 Observation Model 47
    Chapter 4 System Implementation and Evaluation 49
      4.1 EZGO Golf Cart 49
        4.1.1 Equipment 49
        4.1.2 Software Development Kit 55
      4.2 Perception Results 57
        4.2.1 Validation Mechanism 58
        4.2.2 Satellite Visibility Classification 61
        4.2.3 Eliminate GNSS Positioning Error 63
      4.3 Experiment Results 66
        4.3.1 Sunrise Yangsheng Road 67
        4.3.2 Sunrise Tunnel 72
      4.4 Discussion 77
    Chapter 5 Conclusions and Future Works 80
      5.1 Conclusions 80
      5.2 Future Works 81
    Reference 83

    [1] SAE, “Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles J3016_202104,” 2021.
    [2] “The Autoware Foundation.” https://www.autoware.org/ (accessed May 16, 2023).
    [3] “Autopilot | Tesla.” https://www.tesla.com/autopilot (accessed May 16, 2023).
    [4] “Waymo.” https://waymo.com/ (accessed May 16, 2023).
    [5] S. Kuutti, S. Fallah, K. Katsaros, M. Dianati, F. Mccullough and A. Mouzakitis, “A Survey of the State-of-the-Art Localization Techniques and Their Potentials for Autonomous Vehicle Applications,” IEEE Internet of Things Journal, vol. 5, no. 2, pp. 829-846, April 2018, doi: 10.1109/JIOT.2018.2812300.
    [6] ISO, “Definition of a Global Concept for Local Dynamic Maps,” in Intelligent Transport System Cooperative Systems, ISO/TS 18750:2015, Accessed: March 29, 2017. [Online]. Available: https://www.iso.org/standard/63273.html.
    [7] “3D Laser Scanning.” http://www.meccanismocomplesso.org/laser-scanning-3d/2016 (accessed July 13, 2023).
    [8] F. Zhang et al., “A Sensor Fusion Approach for Localization with Cumulative Error Elimination,” in Proceedings of the IEEE Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), 2012, pp. 1-6.
    [9] R. Mur-Artal, J. M. M. Montiel and J. D. Tardós, “ORB-SLAM: A Versatile and Accurate Monocular SLAM System,” IEEE Transactions on Robotics, vol. 31, no. 5, pp. 1147-1163, October 2015, doi: 10.1109/TRO.2015.2463671.
    [10] R. Mur-Artal and J. D. Tardós, “ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo, and RGB-D Cameras,” IEEE Transactions on Robotics, vol. 33, no. 5, pp. 1255-1262, October 2017, doi: 10.1109/TRO.2017.2705103.
    [11] J. K. Suhr, J. Jang, D. Min and H. G. Jung, “Sensor Fusion-Based Low-Cost Vehicle Localization System for Complex Urban Environments,” IEEE Transactions on Intelligent Transportation Systems, vol. 18, no. 5, pp. 1078-1086, May 2017, doi: 10.1109/TITS.2016.2595618.
    [12] J. Breßler, P. Reisdorf, M. Obst and G. Wanielik, “GNSS Positioning in Non-line-of-sight Context—A Survey,” in 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC), Rio de Janeiro, Brazil, 2016, pp. 1147-1154, doi: 10.1109/ITSC.2016.7795701.
    [13] O. Le Marchand, P. Bonnifait, J. Ibañez-Guzmán, F. Peyret and D. Bétaille, “Performance Evaluation of Fault Detection Algorithms as Applied to Automotive Localisation,” in European Navigation Conference - GNSS 2008, Toulouse, France, April 2008.
    [14] S. Bauer, R. Streiter and G. Wanielik, “Non-line-of-sight Mitigation for Reliable Urban GNSS Vehicle Localization Using a Particle Filter,” in 2015 18th International Conference on Information Fusion (Fusion), July 2015, pp. 1664-1671.
    [15] J.-i. Meguro, T. Murata, J.-i. Takiguchi, Y. Amano and T. Hashizume, “GPS Multipath Mitigation for Urban Area Using Omnidirectional Infrared Camera,” IEEE Transactions on Intelligent Transportation Systems, vol. 10, no. 1, pp. 22-30, March 2009, doi: 10.1109/TITS.2008.2011688.
    [16] J. S. Sánchez, A. Gerhmann, P. Thevenon, P. Brocard, A. B. Afia and O. Julien, “Use of a FishEye Camera for GNSS NLOS Exclusion and Characterization in Urban Environments,” in Proceedings of the 2016 International Technical Meeting of The Institute of Navigation, Monterey, California, January 2016, pp. 283-292.
    [17] P. D. Groves, “Principles of GNSS, Inertial, and Multisensor Integrated Navigation Systems,” 2nd ed., Artech House, 2013.
    [18] A. Noureldin, T. B. Karamat and J. Georgy, “Fundamentals of Inertial Navigation, Satellite-based Positioning and their Integration,” Springer Science & Business Media, 2012.
    [19] A. Soloviev, F. van Graas and S. Gunawardena, “Implementation of Deeply Integrated GPS/Low-Cost IMU for Reacquisition and Tracking of Low CNR GPS Signals,” in Proceedings of the 2004 National Technical Meeting of The Institute of Navigation, San Diego, CA, January 2004, pp. 923-935.
    [20] G. Welch and G. Bishop, “An Introduction to the Kalman Filter,” University of North Carolina at Chapel Hill, 2001.
    [21] L. Zhao, W. Ochieng, M. Quddus and R. Noland, “An Extended Kalman Filter Algorithm for Integrating GPS and Low Cost Dead Reckoning System Data for Vehicle Performance and Emissions Monitoring,” The Journal of Navigation, vol. 56, no. 2, pp. 257-275, 2003, doi: 10.1017/S0373463303002212.
    [22] G. Falco, M. Pini and G. Marucco, “Loose and Tight GNSS/INS Integrations: Comparison of Performance Assessed in Real Urban Scenarios,” Sensors, vol. 17, no. 2, p. 255, 2017, doi: 10.3390/s17020255.
    [23] W. Wen, X. Bai, Y. C. Kan and L. T. Hsu, “Tightly Coupled GNSS/INS Integration via Factor Graph and Aided by Fish-Eye Camera,” IEEE Transactions on Vehicular Technology, vol. 68, no. 11, pp. 10651-10662, November 2019, doi: 10.1109/TVT.2019.2944680.
    [24] M. Dunn, “NAVSTAR GPS Space Segment/Navigation User Segment Interfaces, IS-GPS-200,” 2020.
    [25] “GPS Navigation Message.” https://gssc.esa.int/navipedia/index.php/GPS_Navigation_Message (accessed May 20, 2023).
    [26] European Union, “European GNSS (Galileo) Open Service Signal-In-Space Interface Control Document,” 2021.
    [27] “Galileo Navigation Message.” https://gssc.esa.int/navipedia/index.php/Galileo_Navigation_Message (accessed June 10, 2023).
    [28] D. Scaramuzza, A. Martinelli and R. Siegwart, “A Toolbox for Easily Calibrating Omnidirectional Cameras,” in 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 2006, pp. 5695-5701, doi: 10.1109/IROS.2006.282372.
    [29] “Fisheye Calibration Basics.” https://www.mathworks.com/help/vision/ug/fisheye-calibration-basics.html (accessed May 23, 2023).
    [30] “Open Source Computer Vision.” https://docs.opencv.org/3.4/ (accessed May 24, 2023).
    [31] J. Kannala and S. S. Brandt, “A Generic Camera Model and Calibration Method for Conventional, Wide-angle, and Fish-eye Lenses,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 8, pp. 1335-1340, August 2006, doi: 10.1109/TPAMI.2006.153.
    [32] Y. He, R. Martin and A. M. Bilgic, “Approximate Iterative Least Squares Algorithms for GPS Positioning,” in The 10th IEEE International Symposium on Signal Processing and Information Technology, Luxor, Egypt, 2010, pp. 231-236, doi: 10.1109/ISSPIT.2010.5711784.
    [33] W. Zhang, M. Ghogho and B. Yuan, “Mathematical Model and Matlab Simulation of Strapdown Inertial Navigation System,” Modelling and Simulation in Engineering, vol. 2012, pp. 1-25, 2012.
    [34] J. A. Farrell, F. O. Silva, F. Rahman and J. Wendel, “Inertial Measurement Unit Error Modeling Tutorial: Inertial Navigation System State Estimation with Real-Time Sensor Calibration,” IEEE Control Systems Magazine, vol. 42, no. 6, pp. 40-66, December 2022, doi: 10.1109/MCS.2022.3209059.
    [35] J. H. Han, C. H. Park, J. H. Kwon, J. Lee, T. S. Kim and Y. Y. Jang, “Performance Evaluation of Autonomous Driving Control Algorithm for a Crawler-Type Agricultural Vehicle Based on Low-Cost Multi-Sensor Fusion Positioning,” Applied Sciences, vol. 10, 4667, 2020, doi: 10.3390/app10134667.
    [36] “Trimble GNSS Planning Online.” https://www.gnssplanning.com/#/skyplot (accessed June 11, 2023).
    [37] K. Su, S. Jin and M. M. Hoque, “Evaluation of Ionospheric Delay Effects on Multi-GNSS Positioning Performance,” Remote Sensing, vol. 11, 171, 2019, doi: 10.3390/rs11020171.
    [38] E. P. Macalalad, L. C. Tsai and J. Wu, “The Application of Using Taiwan Ionospheric Model in Double-Difference Positioning in Single-Frequency GPS Data,” in 32nd Asian Conference on Remote Sensing, 2011.
    [39] H. Ma and S. Verhagen, “Precise Point Positioning on the Reliable Detection of Tropospheric Model Errors,” Sensors, vol. 20, 1634, 2020, doi: 10.3390/s20061634.

    Full-text access: On campus: immediately available; Off campus: immediately available.