
Graduate Student: Lai, Chao-Ting (賴昭廷)
Thesis Title: Research on Object Grasping Using Visual Servoing in an Autonomous Underwater Vehicle Manipulator System (自主式水下載具機械手臂系統結合視覺伺服抓取目標物之研究)
Advisor: Wang, Shun-Min (王舜民)
Degree: Master
Department: Department of Systems and Naval Mechatronic Engineering, College of Engineering
Year of Publication: 2025
Graduation Academic Year: 113 (ROC calendar)
Language: Chinese
Number of Pages: 107
Keywords: Autonomous Underwater Vehicle Manipulator System (AUVMS), YOLO11 OBB, Pixhawk 2.4.8, Python, Gain-Scheduled PID Controller (GS-PID)

    As the demand for ocean resource exploration and deep-sea missions continues to rise, Autonomous Underwater Vehicles (AUVs) have become essential tools for underwater tasks due to their autonomy, high mobility, and safety. AUVs are widely used in deep-sea resource exploration, submarine cable and pipeline inspection, environmental pollution monitoring, marine biodiversity surveys, and underwater infrastructure maintenance. However, most current AUVs still rely on manual remote control when performing operational tasks, resulting in limited autonomy, especially in complex or high-risk environments.
    This research is motivated by the observation that existing domestic underwater vehicle-manipulator systems remain underdeveloped in multi-sensor integration and control architecture, leaving them inadequate for highly autonomous underwater operations.
    To enhance the level of automation and intelligence in underwater tasks, this study proposes an Autonomous Underwater Vehicle Manipulator System (AUVMS) that integrates image recognition, stereo vision-based distance estimation, autonomous navigation, and visual servoing control. The system provides six degrees of freedom (DOF) of motion, using eight thrusters to ensure attitude and motion stability. The controller, vision module, and power system are housed in three separate watertight compartments. The core control unit consists of a Pixhawk 2.4.8 controller and an NVIDIA Jetson Orin NX, with Python used to implement inter-module communication and control logic.
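    The record itself gives no implementation details. As a rough illustration of the Python-based link between the Jetson Orin NX companion computer and the Pixhawk described above, a minimal sketch using the pymavlink library might look like the following; the serial port, baud rate, and throttle value are illustrative assumptions, not values from the thesis:

```python
# Minimal sketch of the Jetson-to-Pixhawk MAVLink link, assuming the
# pymavlink library and a serial connection; the port, baud rate, and
# neutral throttle value below are illustrative, not from the thesis.
from pymavlink import mavutil

# Open a serial MAVLink connection to the Pixhawk (hypothetical port).
master = mavutil.mavlink_connection('/dev/ttyACM0', baud=115200)

# Block until the autopilot heartbeat confirms the link is alive.
master.wait_heartbeat()
print(f"Connected: system {master.target_system}, component {master.target_component}")

# Read one attitude message (roll/pitch/yaw in radians) from ArduSub.
msg = master.recv_match(type='ATTITUDE', blocking=True, timeout=5)
if msg is not None:
    print(f"yaw = {msg.yaw:.3f} rad")

# Send one MANUAL_CONTROL frame; in ArduSub x/y/r are in [-1000, 1000]
# and the z (throttle) axis is in [0, 1000] with 500 as neutral.
master.mav.manual_control_send(master.target_system, 0, 0, 500, 0, 0)
```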
    Object detection is achieved using the YOLO11 OBB model, paired with a stereo camera for distance estimation. Visual servoing control is employed to achieve precise positioning. For navigation and localization, the system integrates a Doppler Velocity Log (DVL) and dead reckoning, with a gain-scheduled (GS) PID controller used for heading, depth, and altitude control. When the vision system detects a target, the information is transmitted to the main controller via MySQL. Three independent GS-PID controllers then perform visual servo adjustments for X/Y axis position, angle, and distance, achieving high-precision control.
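    The paragraph above outlines the full perception-to-control chain. The sketch below shows one plausible shape for it in Python, chaining an Ultralytics YOLO11 OBB detection, the standard stereo relation Z = fB/d, and an error-magnitude gain schedule that feeds the servo commands. The weight file, gain table, camera geometry, disparity value, and thresholds are all illustrative assumptions, not the thesis's design:

```python
# Sketch of the visual-servo handoff: YOLO11 OBB detection -> pixel/angle/
# range errors -> gain-scheduled PID outputs. All numeric values and file
# names are illustrative assumptions, not values from the thesis.
import math
from ultralytics import YOLO

class GSPID:
    """PID controller whose gains are rescheduled by current error size."""
    def __init__(self, schedule):
        # schedule: (error_threshold, (kp, ki, kd)) pairs, ascending.
        self.schedule = schedule
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        # Pick the first gain set whose threshold covers |error|.
        kp, ki, kd = next(g for thr, g in self.schedule if abs(error) <= thr)
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return kp * error + ki * self.integral + kd * derivative

def stereo_range(disparity_px, focal_px, baseline_m):
    # Pinhole stereo relation: Z = f * B / d.
    return focal_px * baseline_m / disparity_px

model = YOLO("yolo11n-obb.pt")  # hypothetical weights; the thesis trains its own
schedule = [(50.0, (0.8, 0.0, 0.1)),          # small error: gentle gains
            (float("inf"), (1.5, 0.0, 0.2))]  # large error: aggressive gains
# The thesis groups X/Y into one of its three controllers; four instances
# here simply keep each axis's PID state separate.
pid_x, pid_y, pid_ang, pid_rng = (GSPID(schedule) for _ in range(4))

result = model("frame.jpg")[0]           # one camera frame (placeholder path)
if result.obb is not None and len(result.obb) > 0:
    cx, cy, w, h, rot = result.obb.xywhr[0].tolist()
    ex, ey = cx - 640.0, cy - 360.0      # error from a 1280x720 image centre
    e_ang = math.degrees(rot)            # orientation error vs. gripper axis
    # Disparity (25 px) is a placeholder; it would come from stereo matching.
    e_rng = stereo_range(25.0, 700.0, 0.06) - 0.3   # hold a 0.3 m standoff
    cmds = [pid.update(e, 0.1) for pid, e in
            ((pid_x, ex), (pid_y, ey), (pid_ang, e_ang), (pid_rng, e_rng))]
    # In the thesis these values travel through MySQL to the main
    # controller; here we just print the four servo commands.
    print(cmds)
```

    A deployed loop would run this once per camera frame and publish the commands through the MySQL pipeline described above rather than printing them.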
    Multiple experimental validations were conducted in the towing tank of the Department of Systems and Naval Mechatronic Engineering at National Cheng Kung University. In object recognition experiments, the YOLO11 OBB model achieved a mean average precision (mAP) of 0.984 at an IoU threshold of 0.5, per-class recognition accuracy in the confusion matrix exceeded 90%, and the model accurately predicted target orientation angles.
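    For context on the metric: a detection counts as correct at mAP@0.5 only when the predicted and ground-truth boxes overlap with an intersection-over-union (IoU) of at least 0.5. A minimal sketch for axis-aligned boxes follows; the thesis's oriented boxes would require a polygon intersection instead (e.g., via shapely):

```python
# Illustrative IoU for axis-aligned boxes given as (x1, y1, x2, y2).
def iou(a, b):
    # Overlap rectangle, clamped to zero when the boxes are disjoint.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25/175 ≈ 0.143 < 0.5 -> no match
```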
    In positioning control, X- and Y-axis errors were maintained within ±30 pixels, angle control error within ±15 degrees, and distance error within about ±3 cm, demonstrating stable visual servo control. Navigation control experiments confirmed the system's ability to stably execute constant-altitude navigation. In the final integrated test, the AUVMS autonomously completed the entire mission, including altitude-controlled navigation search, visual servo positioning, and object grasping and retrieval, without human intervention, demonstrating strong task execution capability and system stability.
    This research successfully developed an AUVMS capable of autonomous recognition and manipulation, proving its feasibility and practicality for complex underwater missions. The system holds significant potential for applications in ocean monitoring, equipment maintenance, and target retrieval, contributing meaningfully to advancements in underwater technology.

    Abstract
    Extended Abstract
    Acknowledgements
    Table of Contents
    List of Tables
    List of Figures
    Nomenclature
    Chapter 1  Introduction
      1-1  Overview of Autonomous Underwater Vehicle Manipulator Systems
      1-2  Research Motivation and Objectives
      1-3  Literature Review
      1-4  Thesis Organization
    Chapter 2  AUVMS Vehicle Design
      2-1  Vehicle Exterior Design
        2-1-1  Vehicle Frame Design
        2-1-2  Watertight Compartment Design
      2-2  Hardware
        2-2-1  Compartment Layout
        2-2-2  Power System Architecture
        2-2-3  Propulsion Configuration
        2-2-4  Controller Configuration
        2-2-5  Sensors and Operating Equipment
        2-2-6  Image Recognition Equipment
        2-2-7  Communication Equipment
    Chapter 3  Vehicle Software Architecture
      3-1  Core Software Systems
        3-1-1  MAVLink Communication Protocol
        3-1-2  ArduSub
        3-1-3  QGroundControl
      3-2  Data Transmission System
        3-2-1  DVL Positioning Data Transmission
        3-2-2  Pixhawk Data Reading
        3-2-3  Pressure Sensor Data Transmission
      3-3  MySQL Database
        3-3-1  Database Construction
        3-3-2  MySQL Database Reading and Transmission
      3-4  Data Monitoring Human-Machine Interface
    Chapter 4  Image Recognition System
      4-1  Deep Learning
      4-2  YOLO Object Detection Models
        4-2-1  YOLO11 OBB
        4-2-2  Transfer Learning
        4-2-3  YOLO11 OBB Model Training
        4-2-4  Model Performance Evaluation
      4-3  Stereo Vision Distance Measurement
    Chapter 5  Control System Integration
      5-1  Coordinate Systems
      5-2  Equations of Motion
      5-3  GS-PID Controller
      5-4  Navigation Control System
        5-4-1  Heading Controller
        5-4-2  Altitude Controller
        5-4-3  Depth Controller
      5-5  Visual Servo Control System
        5-5-1  Target Positioning Controller
        5-5-2  Target Angle Controller
        5-5-3  Target Distance Controller
    Chapter 6  Experiments and Test Results
      6-1  YOLO Modeling and Validation
        6-1-1  YOLO Model Training
        6-1-2  Confusion Matrix Analysis
        6-1-3  PR Curve Analysis
        6-1-4  Detection Speed Analysis
        6-1-5  Image Recognition Results
        6-1-6  YOLO11 OBB Angle Detection
      6-2  Vehicle Control Experiments
        6-2-1  Heading Control Experiment
        6-2-2  Altitude-Keeping Control Experiment
        6-2-3  Depth-Keeping Control Experiment
        6-2-4  Waypoint Tracking
      6-3  Visual Servo Control Experiments
        6-3-1  Target Positioning Control Experiment
        6-3-2  Target Angle Control Experiment
        6-3-3  Target Distance Control Experiment
      6-4  Integrated System Control Experiment
    Chapter 7  Conclusions and Future Work
      7-1  Conclusions
      7-2  Future Work
    References

