| Graduate student: | 陳建綸 Chen, Chien-Lun |
|---|---|
| Thesis title: | 基於視覺導航修正全球定位系統之旋翼機自動定點降落 Autonomous Landing for Quadcopter Based on Visual Navigation Modifying Global Positioning System |
| Advisor: | 賴維祥 Lai, Wei-Hsiang |
| Degree: | Master |
| Department: | College of Engineering - Department of Aeronautics & Astronautics |
| Year of publication: | 2019 |
| Academic year of graduation: | 107 |
| Language: | Chinese |
| Number of pages: | 98 |
| Chinese keywords: | 無人機、旋翼機、電腦視覺、視覺導航、ArUco圖案、航行算則 |
| English keywords: | quadcopter, UAV, computer vision, visual navigation, ArUco marker, Aviation Formulary |
This thesis develops a method that uses computer vision to correct the Global Positioning System (GPS) so that a quadcopter lands precisely on a ground target, an ArUco marker. The core autonomous-landing algorithm has two parts, computer vision and the Aviation Formulary. The computer vision part defines the target coordinate system and projects the corresponding image points into the camera coordinate system through the intrinsic parameters; the EPnP method then relates the 3D reference points in the world coordinate system to their 2D projections, yielding the camera's 3D position and attitude, from which the relative distance between the camera and the target is obtained. The Aviation Formulary converts this relative distance into a latitude and longitude error, which is added to or subtracted from the original coordinates; the drone then flies to the corrected coordinates and lands automatically.
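As a concrete illustration of the vision step, the sketch below detects an ArUco marker and recovers the camera pose with EPnP, then converts it into the camera's offset from the marker. It assumes OpenCV's `aruco` contrib module with the pre-4.7 Python API; the camera matrix, distortion coefficients, marker side length, and dictionary are placeholders rather than the calibration and markers actually used in the thesis.

```python
import cv2
import numpy as np

# Intrinsic parameters from a prior camera calibration (placeholder values).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)

MARKER_SIDE = 0.5  # marker side length in metres (assumed value)
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

# 3D corners of the marker in its own frame, ordered like the detector's
# corner output (top-left, top-right, bottom-right, bottom-left).
obj_pts = np.array([[-MARKER_SIDE / 2,  MARKER_SIDE / 2, 0.0],
                    [ MARKER_SIDE / 2,  MARKER_SIDE / 2, 0.0],
                    [ MARKER_SIDE / 2, -MARKER_SIDE / 2, 0.0],
                    [-MARKER_SIDE / 2, -MARKER_SIDE / 2, 0.0]], dtype=np.float32)

def camera_offset_from_marker(frame):
    """Return the camera position in the marker frame (metres), or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
    if ids is None:
        return None
    img_pts = corners[0].reshape(4, 2).astype(np.float32)
    # EPnP: match the 3D marker corners with their 2D projections to get the
    # marker pose in the camera frame.
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist,
                                  flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    # Invert the transform: camera position expressed in the marker frame.
    return (-R.T @ tvec).ravel()
```

The returned x/y components give the horizontal offset that, once resolved into north and east components, the navigation step converts into a latitude and longitude correction; the z component approximates the height above the marker.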
A quadcopter carrying an on-board processor and camera was flight-tested in a simple environment, in a complex environment, with waypoint-based automatic landing, and with fully automatic landing. In the simple environment, the altitude, landing error, and latitude and longitude data recorded with ArUco markers of two different side lengths are analyzed and discussed: the marker size affects the recognition distance and how long the marker remains recognizable at low altitude, which in turn determines the landing error. In the complex environment, two ArUco markers of the same size but different IDs were placed among other cluttered patterns; visual landings that pass a safety check against the multiple patterns were flown, and the latitude and longitude changes along the flight path and the landing error are analyzed. Waypoint-based automatic landing integrates waypoint planning with the automatic landing procedure in actual flight, and the different latitude and longitude curves recorded during the flight are discussed. Fully automatic landing adds a small ArUco marker so that, without using LAND mode, the drone keeps correcting its GPS target until it touches down successfully, and the resulting reduction in landing error is examined.
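The navigation step that turns this metric offset into a corrected waypoint can be illustrated with the small-offset approximation commonly derived from the Aviation Formulary: the north offset divided by the Earth radius gives the latitude increment, and the east offset is additionally scaled by the cosine of the latitude for the longitude increment. The function name, Earth-radius constant, and example coordinates below are illustrative assumptions, not the thesis's exact formulae or flight data.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius (assumed constant)

def corrected_waypoint(lat_deg, lon_deg, east_m, north_m):
    """Shift a GPS coordinate by a metric offset (small-distance approximation).

    east_m / north_m: camera-to-marker offset resolved into east/north components.
    Returns the corrected (latitude, longitude) in degrees for the landing point.
    """
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon

# Example: the marker is seen 3 m east and 2 m south of the current GPS fix
# (coordinates are illustrative only).
new_lat, new_lon = corrected_waypoint(23.0000, 120.2000, east_m=3.0, north_m=-2.0)
```

The corrected coordinate would then be uploaded to the flight controller as the new target, mirroring the addition and subtraction of the latitude and longitude error described above.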