
Author: Iong, Pei-Tat (容丕達)
Title: UAV Guidance Using Image Recognition (以影像辨識導引UAV飛行)
Advisor: Chen, Shih-Hsiung (陳世雄)
Degree: Doctor
Department: College of Engineering, Department of Aeronautics & Astronautics
Year of Publication: 2008
Graduation Academic Year: 96
Language: English
Pages: 183
Keywords (Chinese): 導引 (guidance), 影像辨識 (image recognition), 無人飛行載具 (unmanned aerial vehicle)
Keywords (English): UAV, Guidance, Image Recognition
Views: 93; Downloads: 17
    The objective of this research is to develop a single-camera vision guidance system for unmanned aerial vehicles (UAVs). The system automatically searches the images captured by an onboard camera for a specific target of known color and shape, then uses the information extracted from the images to guide the UAV toward that target automatically. Only a single camera is required for the system to perform recognition and guidance on board. The system's work consists of two parts: object recognition and vision guidance. For object recognition, this research uses the HSV color space to describe the target's color features; pixels matching those features are retained by color filtering, and connected component labeling then groups adjacent pixels into candidate objects. Moment invariants are used to describe the target's shape features: the moment invariants of each labeled candidate are computed and compared with those of the known target, and the closest match is identified as the target. For vision guidance, once the target is recognized, the information available on the image plane (position, area, and rotation angle) is computed. An extended Kalman filter then uses this information to estimate the relative position between the UAV and the target as well as the UAV's speed and attitude, and the guidance system applies proportional control to steer the UAV toward the target automatically. Flight simulations, ground tests, and flight tests were conducted. The flight simulation integrates the governing equations of flight dynamics with a Runge-Kutta algorithm and uses a pinhole camera model to compute the target's position, area, and rotation angle on the image plane; it was used to verify the operation of the extended Kalman filter and the flight control system. With noise added to the image-plane information, the simulation results show that the vision guidance system operates correctly and guides the UAV toward the target. The results also show that the estimate of relative lateral displacement converges faster than the estimate of relative altitude, and that the estimates of relative altitude and pitch angle are more sensitive to the settings of the plant noise covariance matrix. In addition, when the system loses the target briefly, the attitude estimates are prone to larger errors because of the way damping is computed in the angular-rate equations. The ground and flight tests used a Senior Telemaster UAV kit fitted with an onboard camera and computer, with a white flag printed with a red cross as the target. The tests covered object recognition computation time, moment invariant measurement, target roll angle measurement, and vision guidance. The experimental results show that the vision guidance system can recognize the target and guide the UAV with an average computation cycle of about 0.4 seconds.
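As a rough sketch of the color-filtering step described above (the thesis's actual HSV thresholds are not reproduced here; the hue/saturation/value bounds and the function name are illustrative):

```python
import colorsys

def hsv_mask(rgb_image, h_range, s_min, v_min):
    """Return a binary mask of pixels whose HSV values fall inside the
    given hue range and above the saturation/value minimums.
    rgb_image: rows of (r, g, b) tuples with components in [0, 1]."""
    h_lo, h_hi = h_range
    mask = []
    for row in rgb_image:
        mask_row = []
        for (r, g, b) in row:
            h, s, v = colorsys.rgb_to_hsv(r, g, b)
            # Hue wraps around 1.0 (red sits at both ends of the scale),
            # so an inverted range means "outside [h_hi, h_lo]".
            in_hue = (h_lo <= h <= h_hi) if h_lo <= h_hi \
                     else (h >= h_lo or h <= h_hi)
            mask_row.append(1 if (in_hue and s >= s_min and v >= v_min) else 0)
        mask.append(mask_row)
    return mask
```

For a red target such as the red cross used in the experiments, the hue range would straddle zero, e.g. `(0.95, 0.05)`.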
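The flight simulation integrates the equations of motion with a Runge-Kutta algorithm; the thesis's state vector and dynamics are not reproduced here, but a generic fourth-order step over an arbitrary state derivative function might look like:

```python
def rk4_step(f, t, x, dt):
    """One fourth-order Runge-Kutta step for dx/dt = f(t, x),
    where x is a list of state variables."""
    def add(a, b, s):
        # a + s * b, elementwise
        return [ai + s * bi for ai, bi in zip(a, b)]
    k1 = f(t, x)
    k2 = f(t + dt / 2, add(x, k1, dt / 2))
    k3 = f(t + dt / 2, add(x, k2, dt / 2))
    k4 = f(t + dt, add(x, k3, dt))
    # Weighted average of the four slope estimates.
    return [xi + dt / 6 * (k1i + 2 * k2i + 2 * k3i + k4i)
            for xi, k1i, k2i, k3i, k4i in zip(x, k1, k2, k3, k4)]
```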

    The objective of this research is to develop a single-camera vision guidance system for unmanned aerial vehicles (UAVs). The system searches for and identifies a target object of known color and shape in images captured by an onboard camera, and the UAV is then guided toward this target automatically; only a single camera is required to perform recognition and guidance on board. The system's tasks consist of object recognition and vision guidance. For object recognition, the HSV color space is used to describe the color features of the target object: pixels whose color is similar to the target's are retained after color filtering, and a connected component labeling algorithm then groups connected pixels into target object candidates. Moment invariants are used to describe the target's shape features: the moment invariants of each candidate are calculated and compared with those of the target object, and the candidate with the most similar shape features is identified as the target. Once the target object is identified, the information available on the image plane, including position, area, and rotation angle, is calculated. This information is processed by an extended Kalman filter to estimate the relative position between the UAV and the target object, as well as the UAV's speed and attitude. The vision guidance system then guides the UAV toward the target automatically through proportional control. Flight simulations, ground tests, and flight tests were performed in this research. Simulation results show that the vision guidance system can guide the UAV toward the target even with noise affecting the image-plane information. They also show that the estimate of relative lateral displacement converges faster than the estimate of relative altitude, and that the estimates of pitch angle and relative altitude are more sensitive to the parameters of the plant noise covariance matrix. In addition, when the vision guidance system loses the target temporarily, the attitude estimation errors grow larger because of the low damping in the simplified angular-rate equations. A Senior Telemaster aircraft model kit fitted with an onboard camera and computer was used for the ground and flight tests, with a white flag printed with a red cross as the target object. The tests covered object recognition computation time, moment invariant calculation, target rotation angle calculation, and vision guidance. They show an average system cycle time of 0.4 seconds; the vision guidance system recognizes the target object and guides the UAV toward it effectively. Directions for future work on improving the vision guidance system are also discussed.
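The connected component labeling step that groups filtered pixels into candidates can be sketched as a classic two-pass algorithm with union-find (a minimal illustration; the thesis's actual implementation and connectivity choice are not reproduced here):

```python
def label_components(mask):
    """Two-pass connected component labeling (4-connectivity) using
    union-find; returns a label image where 0 is background."""
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    parent = {}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    next_label = 1
    # First pass: assign provisional labels, record equivalences.
    for i in range(rows):
        for j in range(cols):
            if not mask[i][j]:
                continue
            up = labels[i - 1][j] if i > 0 else 0
            left = labels[i][j - 1] if j > 0 else 0
            if up and left:
                labels[i][j] = min(up, left)
                union(up, left)
            elif up or left:
                labels[i][j] = up or left
            else:
                labels[i][j] = next_label
                parent[next_label] = next_label
                next_label += 1
    # Second pass: replace provisional labels with representatives.
    for i in range(rows):
        for j in range(cols):
            if labels[i][j]:
                labels[i][j] = find(labels[i][j])
    return labels
```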
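The shape-matching step compares moment invariants of each candidate against those of the known target. As an illustration (only the first two of Hu's seven invariants are shown; the thesis may use more):

```python
def hu_invariants(mask):
    """Compute the first two Hu moment invariants of a binary image.
    These are invariant under translation, scaling, and rotation."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, v in enumerate(row) if v]
    m00 = len(pts)  # zeroth moment = area in pixels
    xbar = sum(x for x, _ in pts) / m00
    ybar = sum(y for _, y in pts) / m00

    def mu(p, q):
        # Central moment: translation-invariant.
        return sum((x - xbar) ** p * (y - ybar) ** q for x, y in pts)

    def eta(p, q):
        # Normalized central moment: adds scale invariance.
        return mu(p, q) / m00 ** (1 + (p + q) / 2)

    phi1 = eta(2, 0) + eta(0, 2)
    phi2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return phi1, phi2
```

Matching then reduces to picking the candidate whose invariant vector has the smallest distance to the target's precomputed invariants.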
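The simulation's camera model is a standard pinhole projection; a minimal sketch, assuming camera coordinates with z pointing forward and illustrative intrinsic parameters:

```python
def pinhole_project(p_cam, f, cx, cy):
    """Project a 3-D point in camera coordinates (x right, y down,
    z forward) to pixel coordinates, given focal length f in pixels
    and principal point (cx, cy)."""
    x, y, z = p_cam
    if z <= 0:
        raise ValueError("point is behind the camera")
    # Perspective division: image offset scales with f / depth.
    u = cx + f * x / z
    v = cy + f * y / z
    return u, v
```

The target's projected area shrinks with the square of depth under this model, which is what lets image-plane area carry range information for the estimator.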
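The state estimation stage can be sketched as a generic extended Kalman filter cycle (the thesis's actual state vector, process model, and measurement model are not reproduced; `f`, `h`, and their Jacobians here are placeholders supplied by the caller):

```python
import numpy as np

def ekf_step(x, P, u, z, f, h, F_jac, H_jac, Q, R):
    """One predict/update cycle of an extended Kalman filter.
    x, P: prior state estimate and covariance
    u, z: control input and measurement
    f, h: nonlinear process and measurement models
    F_jac, H_jac: their Jacobians at the current estimate
    Q, R: plant and measurement noise covariances."""
    # Predict: propagate the state through the nonlinear model and
    # the covariance through its linearization.
    x_pred = f(x, u)
    F = F_jac(x, u)
    P_pred = F @ P @ F.T + Q
    # Update: correct with the measurement residual.
    H = H_jac(x_pred)
    y = z - h(x_pred)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

The abstract's sensitivity observation maps directly onto `Q` here: a larger plant noise covariance makes the filter trust measurements more, which is why the altitude and pitch estimates react strongly to its tuning.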

    CONTENTS
    ABSTRACT IN CHINESE i
    ABSTRACT ix
    ACKNOWLEDGEMENTS xi
    LIST OF TABLES xiv
    LIST OF FIGURES xvi
    NOMENCLATURE xxii
    Chapter 1 Introduction 1
    1.1 Motivation 1
    1.2 Literature Review 2
    1.3 Research Overview 8
    Chapter 2 Object Recognition 13
    2.1 Introduction 13
    2.2 Color Spaces and Color Filtering 14
    2.3 Connected Component Labeling 19
    2.4 Moment Invariants and Object Matching 29
    Chapter 3 Image Guidance 33
    3.1 Introduction 33
    3.2 Coordinate Systems 34
    3.3 Camera Model 38
    3.4 State Estimation 56
    3.5 Automatic Flight Control 70
    Chapter 4 Experiment Devices 79
    4.1 Introduction 79
    4.2 Hardware 79
    4.2.1 Aircraft 79
    4.2.2 Onboard Computer 82
    4.2.3 Onboard Camera 83
    4.2.4 Control Switch 84
    4.2.5 GPS Receiver 87
    4.2.6 Target Object 88
    4.3 Software 88
    Chapter 5 Experiment Results and Discussions 95
    5.1 Introduction 95
    5.2 Connected Component Labeling Algorithm Test 95
    5.3 Color Features and Moment Invariants Test 99
    5.4 Rotation Angle Detection Test 111
    5.5 Flight Simulation 116
    5.6 GPS Receiver Test 152
    5.7 Flight Test 154
    Chapter 6 Conclusion 172
    REFERENCES 178
    PUBLICATION LIST 182
    VITA 183


    Full text availability: on campus, immediate; off campus, from 2008-09-05.