| Graduate Student: | 鄭兆順 Cheng, Chao-Shun |
|---|---|
| Thesis Title: | 在都市環境運用路側聯網裝置實現自駕車之多物件追蹤 Multiple Object Tracking for Autonomous Vehicles via Fusion of Networked Roadside Perception Units in Urban Environments |
| Advisor: | 莊智清 Juang, Jyh-Ching |
| Degree: | 碩士 Master |
| Department: | 電機資訊學院 - 電機工程學系 Department of Electrical Engineering |
| Year of Publication: | 2022 |
| Graduation Academic Year: | 110 |
| Language: | English |
| Number of Pages: | 104 |
| Keywords (Chinese): | 自動駕駛車輛、資料融合、協同式感知、路側聯網裝置 |
| Keywords (English): | Autonomous Vehicle, Data Fusion, Cooperative Perception, Road Side Unit |
Multiple Object Tracking (MOT) plays an essential role in autonomous driving. Its goal is to assign each object a distinct identity and to estimate the corresponding state; only with reliable tracking results can autonomous driving be realized on open roads. The sensors typically used to perceive surrounding objects are cameras, radar, and lidar; however, these sensors are often constrained by weather, lighting, and limited field of view, not to mention the blind spots caused by occluding obstacles. With the development of Vehicle-to-Everything (V2X) communication, cooperative perception is regarded as a solution.
This thesis proposes a multiple object tracking algorithm that combines networked roadside perception units with cooperative perception. Under a high-level fusion architecture, an Interacting Multiple Model (IMM) filter with an Unscented Kalman Filter (UKF) fuses sensing results from road-side units (RSUs) and the on-board unit (OBU) to estimate the states of surrounding objects. In addition, to account for the data uncertainty of the roadside units, an adaptive measurement covariance matrix is proposed and a measurement noise map is constructed.
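For illustration only, the sketch below shows a generic IMM mixing step of the kind that typically precedes each per-model UKF predict/update cycle; the model set, variable names, and transition matrix are assumptions for this example and do not reproduce the thesis implementation.

```python
import numpy as np

# Minimal sketch of a generic IMM mixing step (assumed example, not the thesis code).
# Each motion model (e.g., constant velocity, constant turn rate) keeps its own
# state estimate; before the next UKF cycle the estimates are mixed according to
# the Markov model-transition matrix and the current model probabilities.

def imm_mix(states, covs, mu, transition):
    """states: list of (n,) vectors, covs: list of (n, n) matrices,
    mu: (m,) model probabilities, transition: (m, m) row-stochastic matrix."""
    m = len(states)
    c = transition.T @ mu                        # predicted model probabilities
    w = (transition * mu[:, None]) / c[None, :]  # mixing weights w[i, j]
    mixed_states, mixed_covs = [], []
    for j in range(m):
        x0 = sum(w[i, j] * states[i] for i in range(m))
        P0 = np.zeros_like(covs[0])
        for i in range(m):
            d = states[i] - x0
            P0 += w[i, j] * (covs[i] + np.outer(d, d))
        mixed_states.append(x0)
        mixed_covs.append(P0)
    return mixed_states, mixed_covs, c
```

Each mixed estimate then seeds the corresponding UKF, and the final track output is a probability-weighted combination of the per-model estimates.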
The proposed tracking algorithm has been verified and analyzed in the LGSVL simulator. The simulation results show that it can effectively track objects in the surrounding environment and overcome non-line-of-sight (NLOS) issues. Moreover, the algorithm has been deployed on the National Cheng Kung University autonomous vehicle and tested on real roads with roadside perception units near Shalun Station in Gueiren District, Tainan, Taiwan.
Multiple Object Tracking (MOT) plays an important role in autonomous driving. The task is to maintain a distinct identity for each detected object and to estimate its state; only with reliable tracking results can autonomous driving be deployed on open roads. Typical perception pipelines rely on cameras and radar to perceive surrounding objects. However, these sensors are limited by field of view, extreme weather, and lighting conditions, not to mention blind spots caused by obstacles that occlude the sensor view. With the advent of V2V (vehicle-to-vehicle), V2I (vehicle-to-infrastructure), and V2X (vehicle-to-everything) communication, cooperative perception is regarded as a solution.
This thesis proposes a modular MOT algorithm for autonomous vehicles that uses cooperative perception with networked roadside perception units. Within a high-level fusion architecture, an Interacting Multiple Model (IMM) filter with an Unscented Kalman Filter (UKF) fuses incoming data from road-side units (RSUs) and the on-board unit (OBU) to estimate the states of surrounding objects. In addition, an adaptive measurement covariance matrix is proposed to handle the measurement uncertainty of RSU data by building a measurement noise map.
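As a rough, hypothetical sketch of how such an adaptive covariance could be looked up at run time, an RSU detection's measurement noise can be drawn from a precomputed map indexed by the detection's position; the grid layout, cell size, and names below are assumptions for illustration, not the thesis data format.

```python
import numpy as np

class MeasurementNoiseMap:
    """Hypothetical grid of position-noise standard deviations learned offline
    for an RSU's field of view; layout and resolution are assumed for illustration."""

    def __init__(self, noise_std_grid, cell_size, origin):
        self.grid = np.asarray(noise_std_grid, dtype=float)  # noise std (m) per grid cell
        self.cell_size = float(cell_size)                     # cell edge length (m)
        self.origin = np.asarray(origin, dtype=float)         # map origin in the RSU frame

    def covariance(self, position_xy, default_std=1.0):
        """Return an adaptive 2x2 measurement covariance for one RSU detection."""
        idx = np.floor((np.asarray(position_xy) - self.origin) / self.cell_size).astype(int)
        rows, cols = self.grid.shape
        if 0 <= idx[0] < rows and 0 <= idx[1] < cols:
            std = self.grid[idx[0], idx[1]]
        else:
            std = default_std  # conservative fallback outside the mapped region
        return np.diag([std ** 2, std ** 2])

# Example: build R for a detection before the UKF measurement update.
noise_map = MeasurementNoiseMap(np.full((50, 50), 0.3), cell_size=2.0, origin=(-50.0, -50.0))
R = noise_map.covariance((12.5, -8.0))
```

The intent is simply that detections in poorly observed parts of the RSU's field of view receive a larger covariance and therefore less weight during fusion.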
The proposed MOT algorithm is verified and analyzed in a simulated environment using the LGSVL simulator. The simulation results show that the proposed method effectively tracks surrounding objects across different scenarios and is capable of overcoming non-line-of-sight (NLOS) issues; most importantly, its tracking results are better than those obtained with the OBU alone. Furthermore, the proposed method is implemented on the NCKU autonomous vehicle with RSUs and tested on public roads in Tainan, Taiwan.