
Graduate Student: 田伊婷 (Tien, Yi-Ting)
Thesis Title: 基於視覺的組合導航系統應用於登月小艇著陸模擬 (Visual-Based Integrated Navigation System Applied to a Simulation of Lunar Module Landing)
Advisor: 江凱偉 (Chiang, Kai-Wei)
Degree: Master
Department: Department of Geomatics, College of Engineering
Year of Publication: 2019
Academic Year of Graduation: 107
Language: English
Number of Pages: 92
Chinese Keywords: 基於視覺的導航系統、慣性導航、登月小艇、PANGU、直接稀疏測距
Foreign Keywords: Visual-based navigation system, Inertial navigation system, Lunar landing module, PANGU, Direct sparse odometry
    In recent years, the development of space science and technology has led many countries to begin lunar exploration. A lunar landing module must therefore have good positioning and navigation technology so that the lander can touch down smoothly at a precise destination. Since no satellite positioning system is yet in place on the Moon, visual odometry (VO) with a low-cost camera is an alternative way to help the lunar module find its destination and land accurately. In this study, a visual-based integrated navigation system, which integrates a visual-based navigation system with an inertial navigation system, is implemented using a simulated lunar scene.
    The visual-based navigation system is divided into two parts: visual-based absolute navigation and visual-based relative navigation. The absolute navigation part uses the space resection method from photogrammetry to compute the exterior orientation parameters of the camera, meaning that the absolute position and attitude of the camera on the Moon are obtained from images of the lunar surface. The relative navigation part uses the Direct Sparse Odometry (DSO) algorithm, a visual odometry method, to obtain the relative position and attitude of the camera while it moves. The simulated lunar images used in the visual-based navigation system, including the image database and the real-time flight images, are generated from a lunar digital elevation model (DEM) with the Planet and Asteroid Natural Scene Generation Utility (PANGU) software.
    The inertial navigation system is divided into two parts: inertial measurement unit (IMU) data simulation and a Kalman filter (KF). The raw IMU measurements are obtained by inverse calculation from the reference position data, with noise added. These measurements then drive the state prediction of the Kalman filter, the output of the visual-based navigation system drives the state update, and the filtered result is finally obtained.
    Because the error of visual odometry accumulates with distance, the absolute positions provide the relative navigation with initial values and with updated positions and attitudes. Adding the inertial navigation system filters the visual-based navigation results for better accuracy and stability, while the IMU compensates for the limitation that images cannot be taken too close to the surface. With the integrated navigation system that combines the visual-based navigation system and the inertial navigation system, the positioning process is completed using only the images captured by the camera on the lunar module and the IMU data. In the experimental results, the final horizontal and vertical position errors after an 80 km flight distance are both below 30 m and the attitude errors are below 0.5 degrees; the proposed method effectively reduces the positioning error and is sufficient for the lunar module to land at the target destination.

    The development of space science and technology has led many countries to begin lunar exploration in recent years. A lunar landing module must therefore have good positioning and navigation technology so that the lander can touch down smoothly at a precise destination. Because satellite positioning is not yet established on the Moon, visual odometry (VO) with a low-cost camera is an alternative way to help the lunar module find its destination and land accurately. In this study, a visual-based integrated navigation system, which integrates a visual-based navigation system and an inertial navigation system (INS), is implemented in a simulated lunar environment.
    The visual-based navigation system is divided into two parts: visual-based absolute navigation and visual-based relative navigation. The absolute navigation part uses the space resection method from photogrammetry to calculate the Exterior Orientation Parameters (EOPs); that is, the absolute position and attitude of the camera on the Moon are obtained from images of the lunar surface. The relative navigation part uses the Direct Sparse Odometry (DSO) algorithm, a VO method, to obtain the relative position and attitude of the camera while it moves. The simulated lunar images used in the visual-based navigation system, which include the image database and the real-time flight images, are generated from a lunar Digital Elevation Model (DEM) with the Planet and Asteroid Natural Scene Generation Utility (PANGU) software.
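    As a rough illustration of the absolute-navigation step, the sketch below estimates a camera pose from a handful of lunar-surface control points whose 3D coordinates would come from the DEM and whose image coordinates would come from feature matching against the image database. It uses OpenCV's solvePnP as a stand-in for the photogrammetric space-resection adjustment described above; all coordinates, the camera intrinsics, and the point correspondences are hypothetical placeholders, not values from the thesis.

```python
# Sketch: recover the camera's exterior orientation (position + attitude)
# from known lunar-surface control points and their image observations.
# solvePnP stands in for the photogrammetric space-resection solution.
import numpy as np
import cv2

# 3D control points on the lunar surface (metres, map frame) -- hypothetical.
object_points = np.array([
    [1200.0,  850.0,  10.0],
    [1350.0,  910.0,  -5.0],
    [1100.0,  990.0,  20.0],
    [1280.0, 1050.0,   0.0],
    [1190.0,  900.0,  15.0],
    [1330.0,  980.0,   8.0],
], dtype=np.float64)

# Corresponding image observations (pixels) -- hypothetical matches.
image_points = np.array([
    [512.3, 384.1],
    [640.8, 402.7],
    [455.2, 470.9],
    [598.4, 515.0],
    [500.1, 420.6],
    [615.7, 468.3],
], dtype=np.float64)

# Interior orientation: focal length and principal point (pixels) -- assumed.
K = np.array([[1000.0,    0.0, 512.0],
              [   0.0, 1000.0, 384.0],
              [   0.0,    0.0,   1.0]])
dist = np.zeros(5)  # assume a distortion-free simulated camera

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
if ok:
    R, _ = cv2.Rodrigues(rvec)      # rotation: map frame -> camera frame
    camera_position = -R.T @ tvec   # camera position in the map frame
    print("camera position:", camera_position.ravel())
```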
    The INS is divided into two parts: inertial measurement unit (IMU) data simulation and a Kalman filter (KF). The raw IMU measurements are simulated by inverse calculation from the reference position data, with noise added. These measurements are then used for state prediction in the KF, while the output of the visual-based navigation system is used for the state update, and the filtered result is finally obtained.
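    The sketch below illustrates this predict/update structure under heavy simplification: a one-dimensional position/velocity state, an "IMU" consisting only of accelerations obtained by double-differencing a reference trajectory and adding white noise, and a Kalman filter that predicts with those accelerations and updates with noisy vision-derived position fixes. The thesis uses a full 3-D inertial mechanization with its own noise and filter parameters; every numeric value here is an assumed placeholder.

```python
# Sketch: simulate IMU-like measurements from a reference trajectory,
# then run a loosely coupled Kalman filter (IMU predicts, vision updates).
import numpy as np

rng = np.random.default_rng(0)
dt = 0.1                                   # sampling interval [s] (assumed)
t = np.arange(0.0, 60.0, dt)
ref_pos = 5.0 * t + 0.5 * 0.2 * t**2       # hypothetical reference trajectory [m]

# "Inverse" IMU simulation: accelerations from the reference positions + noise.
ref_acc = np.gradient(np.gradient(ref_pos, dt), dt)
imu_acc = ref_acc + rng.normal(0.0, 0.05, ref_acc.shape)   # accelerometer noise

# Loosely coupled KF: state x = [position, velocity].
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition
B = np.array([[0.5 * dt**2], [dt]])        # acceleration (control) input
H = np.array([[1.0, 0.0]])                 # vision measures position only
Q = np.diag([1e-4, 1e-3])                  # process noise (assumed)
R = np.array([[25.0]])                     # vision position variance (assumed)

x = np.array([[0.0], [0.0]])
P = np.eye(2)
est = []
for k in range(len(t)):
    # Prediction with the simulated IMU measurement.
    x = F @ x + B * imu_acc[k]
    P = F @ P @ F.T + Q
    # Update with a vision-derived position once per second (assumed rate).
    if k % 10 == 0:
        z = ref_pos[k] + rng.normal(0.0, 5.0)   # noisy "visual" position fix
        Kg = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + Kg @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - Kg @ H) @ P
    est.append(x[0, 0])

print("final position error [m]:", abs(est[-1] - ref_pos[-1]))
```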
    Since the error of visual odometry accumulates as the travelled distance increases, the absolute positions provide the relative navigation with initial values and with updated positions and attitudes. Adding the INS filters the visual-based navigation results for better accuracy and stability, while the IMU compensates for the limitation that images cannot be captured too close to the lunar surface.
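    One simple way to picture how an absolute fix re-initializes the relative solution is to re-express the VO trajectory with respect to the most recent absolute position and attitude, so that accumulated drift is reset at every fix. The sketch below shows that rigid re-anchoring under the assumption that the monocular scale has already been resolved; the function name and conventions are illustrative, not the thesis's implementation.

```python
# Sketch: re-anchor a relative (VO) trajectory to an absolute fix.
# Rotations are camera-to-frame matrices; scale is assumed already resolved.
import numpy as np

def anchor_vo_to_fix(p_vo, R_vo, p_fix, R_fix, j):
    """Express VO poses in the absolute (map) frame using the fix at index j.

    p_vo  : (N, 3)    camera positions in the VO frame
    R_vo  : (N, 3, 3) camera-to-VO-frame rotation matrices
    p_fix : (3,)      absolute camera position at index j (map frame)
    R_fix : (3, 3)    absolute camera-to-map rotation at index j
    """
    R_align = R_fix @ R_vo[j].T                     # VO frame -> map frame
    p_map = p_fix + (p_vo - p_vo[j]) @ R_align.T    # rotate offsets into the map frame
    R_map = np.einsum("ij,njk->nik", R_align, R_vo) # re-express attitudes
    return p_map, R_map
```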
    Through the integrated navigation system, which combines the visual-based navigation system and the INS, the navigation process is completed simply by using the images captured by the camera on the lunar landing module and the IMU data. The experimental results show that the final horizontal and vertical position errors after an 80 km flight distance are all less than 50 meters, and the attitude errors are less than 0.5 degrees. The proposed method effectively reduces the positioning error and is sufficient to allow the lunar landing module to land at the target destination.

    中文摘要 (Chinese Abstract) I
    Abstract III
    Acknowledgment V
    Table of Contents VII
    List of Figures IX
    List of Tables XII
    Chapter 1 Introduction 1
        1.1 Background and Motivation 1
        1.2 Thesis Outline 5
    Chapter 2 Lunar Scenarios and Literature Review 7
        2.1 Description of the Lunar Scenarios 7
            2.1.1 Lunar Coordinate System 7
            2.1.2 Lunar Landing Trajectory and Lunar Surface Images 9
        2.2 Review of Existing Lunar Landing Technology 10
    Chapter 3 Inertial Navigation Fundamental 19
        3.1 Historical Perspective of Inertial Navigation 19
        3.2 Coordinate Frames 22
            3.2.1 Inertial Frame (i-frame) 23
            3.2.2 Earth Frame (e-frame) 23
            3.2.3 Local Level Frame (l-frame) 25
            3.2.4 Body Frame (b-frame) 26
        3.3 Inertial Navigation Mechanization Equations 28
        3.4 Inertial Sensor Error Model 35
    Chapter 4 Visual-Based Navigation Algorithms 38
        4.1 Image Simulation from PANGU Software 38
        4.2 Visual-Based Relative Navigation 40
        4.3 Visual-Based Absolute Navigation 43
            4.3.1 Speeded Up Robust Features (SURF) 44
            4.3.2 Photogrammetry Space Resection (PSR) 47
            4.3.3 Visual-Based Absolute Navigation Model 48
    Chapter 5 Visual-Based Integrated Navigation System 52
        5.1 Tight and Loose Coupling 52
        5.2 Fusion of Visual-Based Navigation 54
        5.3 Fusion of Visual-Based Navigation and INS 55
    Chapter 6 Experiment Settings 59
        6.1 Flight Path Region 59
        6.2 Image Data Setting 61
        6.3 Inertial System 65
            6.3.1 IMU Observation 65
            6.3.2 Filter Parameters 66
    Chapter 7 Results and Analysis 70
        7.1 Results and Analysis of Position and Velocity 70
        7.2 Results and Analysis of Attitude 82
    Chapter 8 Conclusions and Future Work 86
        8.1 Conclusions 86
        8.2 Future Works 86
    References 88


    Full-text availability: on campus from 2024-08-30; off campus from 2024-08-30