Graduate Student: 李玟槿 (Li, Wen-Gin)
Thesis Title: 基於雙目視覺之移動機器人自動抓取系統設計 (Automated Grasping System for Mobile Robot Based on Binocular Vision)
Advisor: 侯廷偉 (Hou, Ting-Wei)
Degree: Master
Department: Department of Engineering Science, College of Engineering
Year of Publication: 2023
Graduating Academic Year: 111
Language: Chinese
Number of Pages: 37
Keywords (Chinese): 雙目視覺、物件偵測、深度學習、移動機器人、機械手臂、自動抓取
Keywords (English): binocular vision, object detection, deep learning, mobile robot, robotic arm, automated grasping
Automatically searching for and grasping objects is an important capability of household robots, but the commercially available robots that can do this today are expensive and out of reach for ordinary households. The robot system proposed in this thesis consists of a mobile platform, a low-cost six-axis robotic arm, and two stereo camera pairs. The system uses the stereo cameras and the OpenCV library to perform camera calibration and capture images of the target, and, building on stereo vision and an object detection model, it implements a navigation module and an automated grasping module.
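The record itself contains no code; as a purely illustrative sketch of the calibration step the abstract mentions, the Python fragment below shows one way a stereo pair could be calibrated and rectified with OpenCV. The chessboard pattern (9x6), square size, image-path lists, and the function name calibrate_stereo are assumptions for illustration, not details taken from the thesis.

```python
# Minimal stereo-calibration sketch (assumed 9x6 chessboard and image-path lists;
# all names and parameter values are illustrative, not from the thesis).
import cv2
import numpy as np

def calibrate_stereo(left_imgs, right_imgs, board=(9, 6), square=0.025):
    """Calibrate a stereo pair and return its rectification parameters."""
    objp = np.zeros((board[0] * board[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * square

    obj_pts, left_pts, right_pts = [], [], []
    for lp, rp in zip(left_imgs, right_imgs):
        gl = cv2.cvtColor(cv2.imread(lp), cv2.COLOR_BGR2GRAY)
        gr = cv2.cvtColor(cv2.imread(rp), cv2.COLOR_BGR2GRAY)
        okl, cl = cv2.findChessboardCorners(gl, board)
        okr, cr = cv2.findChessboardCorners(gr, board)
        if okl and okr:
            obj_pts.append(objp)
            left_pts.append(cl)
            right_pts.append(cr)

    size = gl.shape[::-1]
    # Calibrate each camera, then solve for the rotation/translation between them.
    _, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
    _, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)
    _, K1, d1, K2, d2, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, left_pts, right_pts, K1, d1, K2, d2, size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, d1, K2, d2, size, R, T)
    return K1, d1, K2, d2, R1, R2, P1, P2, Q
```

In a pipeline of this kind, the rectification outputs (P1, P2, Q) would then feed the disparity and depth computation used by the navigation and grasping modules.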
The navigation module uses stereo vision to compute the target's depth distance and direction angle relative to the robot and drives the robot to a position in front of the target; the physical size of the target then determines the initial grasping pose of the robotic arm. The automated grasping module computes the depth distance and lateral offset between the target and the robot and applies a compensation method matched to the chosen initial grasping pose, so that the gripper can align with and successfully grasp the target. Finally, experimental results demonstrate that the proposed robot system achieves both navigation and automated grasping, and that configuring the initial grasping pose allows the robot to perform 2D planar grasps from different directions, increasing the flexibility of the automated grasping function.
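The depth and direction-angle computation described above follows standard stereo geometry: depth Z = f·B/d from disparity d, lateral offset X = Z·(u - cx)/f, and direction angle θ = atan2(X, Z). The sketch below is one possible realization in Python with OpenCV; the SGBM matcher, the parameter values, and the function name target_depth_and_angle are illustrative assumptions rather than the thesis implementation.

```python
# Illustrative sketch of the navigation geometry described in the abstract:
# depth from disparity and direction angle from the horizontal image offset.
# Assumes rectified images, focal length f (pixels), baseline B (metres),
# principal point cx, and a bounding box (x, y, w, h) from the object detector.
import math

import cv2
import numpy as np

def target_depth_and_angle(left_gray, right_gray, box, f, B, cx):
    """Return (depth_m, angle_rad) of a detected target in the left image."""
    # Dense disparity over the rectified pair (semi-global block matching).
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=96, blockSize=7)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

    x, y, w, h = box
    roi = disparity[y:y + h, x:x + w]
    d = np.median(roi[roi > 0])          # robust disparity inside the box
    Z = f * B / d                        # depth distance (m)

    u = x + w / 2.0                      # horizontal centre of the box
    X = Z * (u - cx) / f                 # lateral offset (m)
    theta = math.atan2(X, Z)             # direction angle from the optical axis
    return Z, theta
```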