| Student: | 李建德 Lee, Jian-De |
|---|---|
| Thesis Title: | 移動載具之環繞視覺影像系統 An Around-View Camera System for Moving Vehicle |
| Advisor: | 譚俊豪 Tarn, Jiun-Haur |
| Degree: | Master |
| Department: | College of Engineering - Department of Aeronautics & Astronautics |
| Year of Publication: | 2021 |
| Graduation Academic Year: | 109 |
| Language: | Chinese |
| Pages: | 53 |
| Chinese Keywords: | 車輛輔助系統、相機環繞視覺、同時定位與地圖構建、內方位參數校正、外方位參數校正 |
| English Keywords: | ADAS, around-view cameras, SLAM, intrinsic calibration, extrinsic calibration |
Compared with a decade ago, research on autonomous driving has become far more mature and widespread. At SAE Level 2, the vehicle can automatically perform steering and acceleration, but the driver must still monitor the entire driving process and may take back control at any time; this thesis refers to such technology as ADAS (Advanced Driver Assistance Systems).
The around-view camera system for a moving vehicle is an ADAS function: it produces a top-down surround view and assists the driver in maneuvering the vehicle into a parking space. The system consists of four to six cameras mounted around the vehicle. From the RGB camera images, a SLAM (simultaneous localization and mapping) method is used to obtain each camera's extrinsic parameters in space. Once the relative poses between the cameras are known, image processing yields the top-down surround view, which in turn assists the driver in parking. In this thesis, the author describes the pipeline of the around-view camera system in detail.
Nowadays, research on autonomous driving is far more widespread than it was a decade ago. At SAE Level 2, the vehicle can perform steering and acceleration, while the human driver still monitors all tasks and can take control at any time; this is known as ADAS (Advanced Driver Assistance Systems) technology.
An around-view camera system is an ADAS technology that gives the driver a top-view surrounding image and assists the driver in maneuvering into a parking space. The around-view system consists of four to six cameras installed around the vehicle. From these RGB camera images, we use SLAM-based methods to calculate the cameras' extrinsic parameters. Once we obtain each camera's relative pose and perform the image processing, a composite bird's-eye view of the vehicle is shown on the monitor to assist the driver during parking. In this thesis, the process of producing an around-view image is introduced in detail.
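The geometric step underlying the bird's-eye-view composition is an inverse perspective mapping: for points on the ground plane (z = 0), a camera with intrinsics K, rotation R, and translation t projects through the planar homography H = K [r1 r2 t] (r1, r2 being the first two columns of R), and inverting H maps pixels back onto the ground plane. The following is a minimal NumPy sketch of this mapping; the values of K, theta, and t are illustrative assumptions, not parameters from the thesis:

```python
import numpy as np

# Hypothetical intrinsics for one camera (illustrative values only).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Hypothetical extrinsics: camera pitched 30 degrees toward the ground,
# mounted 2 m above it (again, assumed values for illustration).
theta = np.deg2rad(30.0)
R = np.array([[1.0,           0.0,            0.0],
              [0.0, np.cos(theta), -np.sin(theta)],
              [0.0, np.sin(theta),  np.cos(theta)]])
t = np.array([0.0, 0.0, 2.0])

# Ground-plane homography: world point (X, Y, 0) -> pixel (u, v).
H = K @ np.column_stack((R[:, 0], R[:, 1], t))

def ground_to_image(X, Y):
    """Project a ground-plane point (X, Y, 0) into pixel coordinates."""
    p = H @ np.array([X, Y, 1.0])
    return p[:2] / p[2]

def image_to_ground(u, v):
    """Invert the homography: pixel -> ground-plane point (the IPM step)."""
    q = np.linalg.inv(H) @ np.array([u, v, 1.0])
    return q[:2] / q[2]

# Round trip: a ground point projects into the image and maps back exactly.
u, v = ground_to_image(1.0, 5.0)
X, Y = image_to_ground(u, v)
```

In a full around-view pipeline this inverse mapping would be applied per camera (after intrinsic and extrinsic calibration) and the resulting ground-plane images blended into one composite top view.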