| Author: | 張文威 (Chang, Wen-Wei) |
|---|---|
| Thesis Title: | 應用影像辨識導引UVMS於水下目標物抓取之研究 (Research on Underwater Object Capturing Using Image Recognition to Guide UVMS) |
| Advisor: | 王舜民 (Wang, Shun-Min) |
| Degree: | Master |
| Department: | Department of Systems and Naval Mechatronic Engineering, College of Engineering |
| Publication Year: | 2024 |
| Academic Year: | 112 |
| Language: | Chinese |
| Pages: | 103 |
| Keywords (Chinese): | UVMS、影像辨識、深度學習、雙目立體視覺、視覺伺服控制 |
| Keywords (English): | UVMS, Image recognition, Deep learning, Stereo vision, Visual servo control |
| Access Count: | Views: 73; Downloads: 16 |
As technology advances, underwater vehicles are increasingly deployed in deep-sea environments, where they serve as essential tools for exploration, monitoring, and maintenance. This study developed an underwater vehicle manipulator system (UVMS) for underwater object-grasping tasks. The frame of the UVMS is made of high-density polyethylene (HDPE) and carries a single-degree-of-freedom manipulator; an underwater stereo camera mounted above the arm captures real-time images for object tracking. Because the variability of the underwater environment degrades image feature extraction and thus detection accuracy, this study adopts a deep learning approach: a YOLO object detection model, built on a convolutional neural network (CNN) architecture, extracts features of underwater objects, and stereo vision provides tracking and localization of the target. Once the image features and stereo depth are obtained, an image-based visual servo (IBVS) control method uses the object features and depth information from the real-time images as feedback signals to guide the UVMS to a position in front of the object. To verify the feasibility of this method, tests were conducted in the stability performance tank of the Department of Systems and Naval Mechatronic Engineering at National Cheng Kung University. Under clear-water conditions, the system autonomously guided and tracked a target from a distance of two meters and grasped the underwater object.
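The two computational steps in the abstract — stereo triangulation to recover depth, and a proportional IBVS law that turns the feature error into vehicle commands — can be sketched as follows. This is an illustrative sketch only: the camera parameters, gains, image resolution, and the two-meter standoff target are assumptions for demonstration, not values from the thesis.

```python
# Minimal sketch of the pipeline described in the abstract:
# (1) pinhole stereo triangulation converts a left/right pixel disparity
#     into depth, Z = f * B / d;
# (2) a proportional image-based visual servo (IBVS) step converts the
#     horizontal feature error and depth error into yaw/surge commands.
# All names, gains, and the 640-px image width are hypothetical.

from dataclasses import dataclass


@dataclass
class StereoRig:
    focal_px: float    # focal length in pixels (from camera calibration)
    baseline_m: float  # distance between the two camera optical centers

    def depth(self, disparity_px: float) -> float:
        """Pinhole-model depth from disparity: Z = f * B / d."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        return self.focal_px * self.baseline_m / disparity_px


def ibvs_step(u_px, depth_m, u_center_px=320.0, target_depth_m=2.0,
              k_yaw=0.002, k_surge=0.5):
    """One proportional IBVS update (signs and gains are illustrative).

    u_px    : horizontal pixel coordinate of the detected object center
              (e.g. the center of a YOLO bounding box)
    depth_m : stereo depth to the object
    returns : (yaw_cmd, surge_cmd) to steer toward and close on the target
    """
    yaw_cmd = k_yaw * (u_px - u_center_px)            # center the target
    surge_cmd = k_surge * (depth_m - target_depth_m)  # hold a 2 m standoff
    return yaw_cmd, surge_cmd


# Example: a 24-px disparity with an 800-px focal length and 6 cm baseline
# gives Z = 800 * 0.06 / 24 = 2.0 m.
rig = StereoRig(focal_px=800.0, baseline_m=0.06)
z = rig.depth(disparity_px=24.0)
yaw, surge = ibvs_step(u_px=400.0, depth_m=z)
print(z, yaw, surge)
```

In the thesis pipeline, `u_px` would come from the YOLO detection on the real-time stereo image and `disparity_px` from matching that detection between the left and right views; the commands feed the vehicle's thrusters until the target sits centered at the desired standoff distance.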